Digging this thread back up because Storage Buckets are so limited. We should be able to directly read Excel files and other file types. This should be integrated into existing activities like Excel Application Scope rather than requiring separate activities. Every activity that has a source path property should be able to have that path point to a file in a Storage Bucket.
I want to store an Excel file in a storage bucket and read it via the Read Storage Text activity. But it asks for an encoding parameter, and UTF-8 didn't work.
An actual Excel file is not simply text, though. If it were a CSV it would be text, but even then, if the idea is to modify the text, you would still have to change it in memory (as opposed to on disk) and then write it back to the Storage Bucket.
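To make the point concrete, here is a minimal stdlib-only Python sketch (not UiPath activities): an .xlsx file is a ZIP container of XML parts, so its raw bytes begin with the binary ZIP signature rather than decodable text, which is why a read-as-text activity with a UTF-8 encoding parameter fails on it.

```python
import io
import zipfile

# Build a minimal ZIP archive in memory to stand in for an .xlsx file
# (.xlsx is a ZIP container of XML parts, not plain text).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("xl/workbook.xml", "<workbook/>")
raw = buf.getvalue()

# The file starts with the binary ZIP local-file-header signature,
# and the compressed payload bytes that follow are not guaranteed to
# form valid UTF-8, so decoding the whole file as text is not meaningful.
print(raw[:4])  # → b'PK\x03\x04'
```

The same applies to any other binary format stored in a bucket: you fetch bytes, and only formats that are genuinely text (CSV, JSON, XML) can be read through a text activity.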
I'm not sure what the aversion is to fetching the file and reading the contents locally. It is no different from a blob in a database, a serialized JSON object in a Queue Item, or, more similarly, an Amazon S3 bucket, where you need to fetch the object before you can act on its content.
I agree that building a feature into Orchestrator that lets activities reference and act on an object in the Storage Bucket would be convenient, but what is it really saving? The act of downloading and storing to disk, and the cleanup of the local temporary file afterwards. Using FTP/SFTP, NFS, or other storage mediums is really no different; those would just mask these steps, and most of the time this is handled by the OS or by the protocol implementation. Personally, I'd rather work on files local to the Robot anyway when possible, given the lack of control over network performance, which can have a significant impact on processing capability depending on the size of the files you are working with.