Hi, I need to create a bot to support the production of reports. There are about 400 variations, each with its own set of KPIs, and it will be dealing with around 50 data sources (some core system data, some Excel spreadsheets, etc.). I can access the data, but what would be the optimal construct? The objective is for the bot to produce the data consolidation and then the individual reports. What would be the best approach? Thanks for any ideas to explore!
My suggestion is to structure the bot so the I/O operations stay efficient:
A. Create a template for each of the 400 variations
- Prepare a template that you can copy from a folder, then work on the copy
- Extract the relevant data to fill in the fields of the copied template
- Predefine default values in case the data extraction returns NULL
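A minimal Python sketch of step A, under some stated assumptions: the template folder layout, the placeholder syntax (`{field}`), and the field names/defaults below are hypothetical, but the flow matches the steps above — copy the master template, fill its fields, and fall back to a predefined default whenever extraction returned NULL.

```python
import shutil
from pathlib import Path

# Hypothetical per-field defaults, used when extraction returns None/NULL
DEFAULTS = {"region": "N/A", "total_sales": 0, "kpi_score": "Not available"}

def prepare_report(variation_id, extracted, template_dir="templates", out_dir="work"):
    """Copy this variation's template, then fill its placeholder fields."""
    src = Path(template_dir) / f"{variation_id}.txt"
    dst = Path(out_dir) / f"{variation_id}_report.txt"
    Path(out_dir).mkdir(exist_ok=True)
    shutil.copy(src, dst)  # always work on the copy, never the master template

    text = dst.read_text()
    for field, default in DEFAULTS.items():
        value = extracted.get(field)  # may be None if extraction failed
        text = text.replace(f"{{{field}}}", str(value if value is not None else default))
    dst.write_text(text)
    return dst
```

With 400 variations this reduces the report step to "pick the right template file, pass in the extracted dictionary" — the per-variation differences live in the templates, not in the bot's code.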
B. Data preparation:
- Have a few first-tier robots pull from each of the 50 data sources
- Have a second-tier robot filter, cleanse, sort, update, etc. the data into fields, tables, images, etc.
- Have a third-tier robot duplicate the data produced by the second-tier robot (the direction is one-way)
- Have the last-tier robot extract from the duplicated data store (I/O direction is one-way only)
- The first-tier robots extract lots of raw data onto a local machine
- The second-tier robot converts the raw data into meaningful data locally (no network latency)
- The third-tier robot prepares a separate data store that is read-only for the template-generating robot
- The last-tier (template) robot generates the reports from the read-only data store
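The one-way tiered flow above could be sketched like this. It's only a sketch under assumptions: the folder names (`raw`, `clean`, `published`), the CSV format, and the cleansing rules are placeholders, and in practice each tier would run as a separate robot/process rather than as functions in one script — the point is that data only ever moves forward, and the report tier touches nothing but the published store.

```python
import shutil
from pathlib import Path

# Hypothetical folder layout for the one-way data flow between tiers
RAW, CLEAN, PUBLISHED = Path("raw"), Path("clean"), Path("published")

def tier1_pull(source_name, fetch):
    """First tier: dump raw data from one source onto the local machine."""
    RAW.mkdir(exist_ok=True)
    (RAW / f"{source_name}.csv").write_text(fetch())

def tier2_transform(source_name):
    """Second tier: cleanse/sort the raw dump locally (no network latency)."""
    CLEAN.mkdir(exist_ok=True)
    rows = (RAW / f"{source_name}.csv").read_text().splitlines()
    cleaned = sorted(r.strip() for r in rows if r.strip())  # drop blanks, sort
    (CLEAN / f"{source_name}.csv").write_text("\n".join(cleaned))

def tier3_publish(source_name):
    """Third tier: one-way copy into a store the report robot only reads."""
    PUBLISHED.mkdir(exist_ok=True)
    shutil.copy(CLEAN / f"{source_name}.csv", PUBLISHED / f"{source_name}.csv")

def tier4_report(source_name):
    """Last tier: generate report content from the read-only store."""
    data = (PUBLISHED / f"{source_name}.csv").read_text().splitlines()
    return f"{source_name}: {len(data)} records"
```

Keeping each tier's output in its own folder (or database) means the 50 pull robots, the cleansing robot, and the 400 report generations can be scheduled and retried independently without stepping on each other's data.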