Creating A New Worksheet In Each Iteration of a For Each Loop?

I currently have a For Each loop that takes a list of people's names from Excel, searches each one on a website, scrapes the table that appears, and then appends the range to a new, empty Excel workbook. Using Append Range is very ugly: each table has many rows, and it's hard to see where one table ends and the next begins.

So my question is: Is it possible to create a new sheet in the same workbook for each iteration? Preferably with the person’s name as the sheet name.

So I would have a sheet called John Smith containing John Smith's info table, then another sheet called Jack Johnson with that person's table, etc.

Hi @ds56, since I don't know how your data is formatted I can only give general guidance. Store the person's name from the data table in a variable, then use the Write Range activity inside the For Each (be sure to place it inside an Excel Application Scope!), pass the person's name as the sheet name, and write the data table to that sheet.
This will create a new sheet for every iteration.


Yes this is entirely possible:

  • For Each name in lstNames (where lstNames is your list of names)
  • Perform your normal actions (scrape the web data, etc.)
  • Still within the For Each, instead of using Append Range, use a Write Range activity - and make sure you choose the Write Range from the 'Workbook' activities, not the 'Excel' activities
  • Use the following properties in the Write Range:
    • DataTable will be the info you scraped from the webpage
    • FilePath will be the file path of the Excel file
    • SheetName will be item - which represents the name you are currently on in lstNames
    • make sure AddHeaders is ticked

So as it iterates through the list of names, it will scrape the data you need into a data table, then add that data table (including the header) to a new sheet within the same Excel file each time. You will end up with a sheet for each person containing the data you scraped. Let me know how you get on!
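For anyone doing this outside UiPath, the steps above can be sketched in plain Python with pandas (assuming pandas and openpyxl are installed). Here `scrape_person` is a hypothetical stand-in for the web-scraping step; it just needs to return a DataFrame for the given name:

```python
# Sketch of the loop described above: one sheet per person, named
# after them, all in the same workbook.
# Assumptions: pandas + openpyxl are installed; scrape_person(name)
# is your own function returning a pandas DataFrame.
import pandas as pd

def write_sheets(names, scrape_person, path="people.xlsx"):
    """Write one sheet per name into a single Excel workbook."""
    with pd.ExcelWriter(path, engine="openpyxl") as writer:
        for name in names:
            df = scrape_person(name)
            # Excel caps sheet names at 31 characters, so truncate.
            df.to_excel(writer, sheet_name=name[:31], index=False)
```

This mirrors the UiPath flow: the loop body scrapes into a data table, and the write step targets a sheet named after the current person instead of appending to one long sheet.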


Thank you! That was a quick and easy fix and it worked perfectly!


Glad to help! 🙂

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.