Data retention policy of Data Service entity table

How do you implement a data retention policy for a Data Service (DS) entity table in the scenario below?

Production bots store key process metrics in a DS entity table after every 5-minute run.

The objective is to back up the data and build a BI dashboard for business stakeholders.

@Sonalk - Data Service never deletes data automatically from an entity table, so the retention timeline is entirely your choice.

I am not sure what you mean by "backup data & create BI dashboards" - could you elaborate?

If you want to back up data to another location such as your enterprise data warehouse (EDW), my suggestion would be to set up an unattended job that runs every X hours (maybe 4 or 6). That unattended process queries the data from Data Service and writes it to your EDW. Once the write to the EDW has finished, you can use the delete entity record activities (single or batch) to remove those records from Data Service. Once the data is in your EDW, you can build BI dashboards on top of it. A sketch of this flow follows below.
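For illustration, here is a minimal Python sketch of that backup-then-purge loop. Everything tenant-specific in it is an assumption, not from this thread: the REST routes, the entity name `ProcessMetrics`, the paging payload, and the `write_to_edw` helper are hypothetical placeholders, so check your tenant's Data Service OpenAPI (Swagger) definition for the real routes and schemas.

```python
"""Sketch of an unattended backup-then-purge job (all routes hypothetical)."""
import requests

BASE_URL = "https://cloud.uipath.com/{org}/{tenant}/dataservice_/api"  # hypothetical route prefix
ENTITY = "ProcessMetrics"  # hypothetical entity name
HEADERS = {"Authorization": "Bearer <token>"}  # token acquisition omitted


def write_to_edw(records: list[dict]) -> None:
    # Placeholder: replace with a bulk INSERT via your warehouse's client.
    # It should raise on failure so the delete step below never runs
    # without a confirmed backup.
    ...


def backup_and_purge(page_size: int = 100) -> None:
    while True:
        # 1. Query one page of records (hypothetical route and payload).
        resp = requests.post(
            f"{BASE_URL}/EntityService/{ENTITY}/query",
            json={"start": 0, "limit": page_size},
            headers=HEADERS,
        )
        resp.raise_for_status()
        records = resp.json().get("value", [])
        if not records:
            break

        # 2. Back the page up to the enterprise data warehouse first.
        write_to_edw(records)

        # 3. Only after the EDW write succeeds, batch-delete the same
        #    records from Data Service to reclaim storage.
        ids = [r["Id"] for r in records]
        requests.post(
            f"{BASE_URL}/EntityService/{ENTITY}/delete-batch",  # hypothetical route
            json={"recordIds": ids},
            headers=HEADERS,
        ).raise_for_status()


if __name__ == "__main__":
    backup_and_purge()
```

The ordering matters: writing to the EDW before deleting means a mid-run failure can only leave duplicates in the warehouse, never lose data.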

If you don’t want to remove data from Data Service and would rather connect Power BI to Data Service directly, I shared an example custom connector here: Access to Data Service with Power Bi - #19 by ankit.saraf

Thanks, Ankit, for your input.

We have 12 Data Service units (each unit grants us 1 GB of data storage, 5 GB of attachment storage, and 10,000 API calls/day).

What is the best practice for backing up data from a Data Service entity table so that we avoid fully consuming the allowances above?

As I mentioned, data is added to the entity table every 5 minutes, every day. I am just worried about fully consuming the DS units given those specifications.

@Sonalk - Yes, you are right that you will get about 12 GB of total data storage. Depending on how much data is added in each run, it will still be a long time before you hit the overall 12 GB limit. (A simple back-of-the-envelope calculation says that if you add 10 KB of text data every 5 minutes, it will still take 10+ years :slight_smile: )
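For concreteness, here is the arithmetic behind that estimate (the 10 KB-per-run figure is illustrative, as above):

```python
# Back-of-the-envelope check of the "10+ years" estimate above.
KB_PER_RUN = 10                    # assumed payload written per 5-minute run
RUNS_PER_DAY = 24 * 60 // 5        # 288 runs per day
kb_per_year = KB_PER_RUN * RUNS_PER_DAY * 365   # ~1.05 million KB ~= 1 GB/year
total_kb = 12 * 1024 * 1024        # 12 units x 1 GB of data storage each
print(f"~{total_kb / kb_per_year:.1f} years to fill 12 GB")  # -> ~12.0 years
```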

Additionally, since you are an enterprise customer, we will not block your usage if you hit the limit; instead, someone will reach out and work with you to get your usage back under compliance, so don’t worry about that.

That said, if you want to back up your data to another location and remove it from Data Service, both options are available to you: the Studio activities or the APIs (REST, described by a Swagger/OpenAPI definition). We have provided batch read (Query) and batch delete activities/APIs exactly for customer needs like these; see the sketch below.
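To tie the batch delete back to the 10,000-API-calls/day allowance mentioned earlier, here is a hedged sketch of chunked deletion with a daily call budget. The route and the 100-records-per-call chunk size are assumptions, so verify both against your tenant's Swagger definition:

```python
import requests

BASE_URL = "https://cloud.uipath.com/{org}/{tenant}/dataservice_/api"  # hypothetical
HEADERS = {"Authorization": "Bearer <token>"}  # token acquisition omitted
CHUNK = 100  # assumed records per batch-delete call; confirm the real maximum


def purge_in_batches(entity: str, record_ids: list[str],
                     daily_call_budget: int = 1000) -> list[str]:
    """Batch-delete record_ids, stopping before the call budget is spent.

    At 100 records per call, 1,000 calls purges 100,000 records while
    leaving 9,000 of the day's 10,000 API calls for the production bots.
    Returns any leftover ids so the caller can resume the next day.
    """
    calls = 0
    for i in range(0, len(record_ids), CHUNK):
        if calls >= daily_call_budget:
            return record_ids[i:]  # leftover ids; resume tomorrow
        requests.post(
            f"{BASE_URL}/EntityService/{entity}/delete-batch",  # hypothetical route
            json={"recordIds": record_ids[i:i + CHUNK]},
            headers=HEADERS,
        ).raise_for_status()
        calls += 1
    return []
```

Budgeting deletes this way keeps a cleanup backlog from ever starving the bots that write metrics every 5 minutes.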
