Automate ServiceNow website/page


I have to select cases one by one from the ServiceNow website and perform some steps on each. I hope you understand what I mean by the ServiceNow website, or you could say a ServiceNow webpage.
The ServiceNow page shows the cases in a div-based table, and the table spans multiple pages.

I’m confused about which method is best to use…
I tried Data Extraction first, choosing the Case ID column, but when I run it, the output DataTable is empty. Any idea why?

Also, am I using the right method or not? Or is there another way to go through the webpage, selecting the Case IDs one by one?

Kindly help. Thanks

Did the Data Scraping method help with this?

Did you try the SNOW connector made by @Cristian_Negulescu?
It is much more reliable than UI Automation and works in the background.

Thanks … the problem is that, for security reasons, I have no rights to install any software on the machine (I have no administrative rights)… but if there is a way to use it without administrative rights, do let me know.

I was also confused: should I use Data Scraping or Data Extraction?


Data Scraping

OK, but then I have to put the data into Excel to read it row by row, right? Or can I do it without writing to Excel?
The reason I ask is that this ServiceNow instance will be updated by another robot after half an hour, so the data positions will not stay the same and maybe 10 more rows will come in within that half hour… How will that work if I use scraped data stored in Excel?


Yes, that can be done: write the data obtained with the Data Scraping method to an Excel file in a folder, using the WRITE RANGE activity.

As you say, another robot is going to process the data from the Excel file an hour later, and if this robot has to add 10 new rows, that can again be done with DATA SCRAPING: get the DataTable as output and write it back to the same Excel file with the APPEND RANGE activity.
Because the data structure obtained in that half hour will be the same as what we obtained at first, Append Range works, and when the other robot runs an hour later it will include the new rows as well.
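The Write Range / Append Range flow can be sketched outside UiPath too. Here is a minimal Python illustration (using a CSV file instead of Excel, with assumed column names) of writing the first scrape and then appending later rows with the same structure:

```python
import csv

COLUMNS = ["CaseID", "ShortDescription"]  # assumed column names

def write_range(path, rows):
    """Mimic UiPath's Write Range: (re)create the file with headers."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)

def append_range(path, rows):
    """Mimic UiPath's Append Range: add rows without rewriting headers."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writerows(rows)

# First scrape: write the initial table
write_range("cases.csv", [{"CaseID": "CS001", "ShortDescription": "Login issue"}])
# Half an hour later: append the newly scraped rows
append_range("cases.csv", [{"CaseID": "CS002", "ShortDescription": "VPN down"}])
```

Because both writes use the same column set, the appended rows line up with the original table, which is the point being made above.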

Cheers @Latif

I think I didn’t explain it well…

An RPA robot run from Orchestrator is filing the cases on that particular ServiceNow site; from there, the new RDA robot will read those SN cases one by one and perform the SAP automation steps.

So the question is: if I use Data Scraping, add the data to Excel, and then go case by case… I feel it will crash, because if the RPA robot has meanwhile added 20 new rows, the scraped data will no longer match and the Excel data will have moved to the second page.
Half an hour is just an example; maybe the RPA robot runs full time.

Do you get what I mean?

If two robots are involved, one gathering the data and one processing it, we can use the DISPATCHER and PERFORMER concept:

–we can use QUEUES, where the first robot, termed the dispatcher, adds the data to the queue, and
–the other robot, termed the performer, simultaneously takes items from the same queue and processes them.

If this is possible, we can go for Data Scraping without any hesitation and add the data from the DataTable to the QUEUE row by row.
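The dispatcher/performer split above can be sketched in plain Python, with a `queue.Queue` standing in for the Orchestrator queue (in UiPath the real activities would be Add Queue Item and Get Transaction Item):

```python
import queue

# Stand-in for an Orchestrator queue: the dispatcher adds one item per
# scraped row, and the performer consumes them independently.
work_queue = queue.Queue()

def dispatcher(scraped_rows):
    """Add each row of the scraped DataTable to the queue (Add Queue Item)."""
    for row in scraped_rows:
        work_queue.put(row)

def performer():
    """Take transactions from the queue (Get Transaction Item) until empty."""
    processed = []
    while not work_queue.empty():
        case = work_queue.get()
        processed.append(case["CaseID"])  # stand-in for the real SAP/SN steps
        work_queue.task_done()
    return processed

dispatcher([{"CaseID": "CS001"}, {"CaseID": "CS002"}])
print(performer())  # → ['CS001', 'CS002']
```

The advantage over a shared Excel file is exactly the one discussed above: new rows added while the performer is running simply become new queue items, so positions never shift underneath the second robot.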

Does that sound good?
Kindly correct me if I have still misunderstood the question.

Cheers @Latif

You got it, but… the problem is…

The first RPA robot runs from Orchestrator… (it compares data from a database, then does one process in SAP and a second in ServiceNow…)
The second is an RDA robot and will not run from Orchestrator… (it will look at the data in SN and then finish the work in SAP)

I have done the SAP part with your help :slight_smile: so now I have to read the SN case number and the other case details, and use those details in SAP for the further process… After finishing the first case, I have to do the same with the second SN case number, and so on.
When I click an SN case number, it opens the case detail page; from there I read the data I want, read the PDF, and so on…
After collecting that data I move to SAP, do the steps, then return to the same case in SN, write the SAP result/status, and close the SN case.
After that, on to the next SN case.

So that is what my robot should do.

May I know how this process is triggered, if not from Orchestrator?

A person will start it every day in the morning :slight_smile:


Then this can be done simply, without involving queues:
–we can store the SN cases in an Excel file the previous day; when a person triggers the bot, it reads that Excel file and processes the cases one by one in ServiceNow, iterating through each row with a FOR EACH ROW loop
–and, as already said, if any new data has to be added to the existing Excel file before the second part of the process runs, we can use the APPEND RANGE activity
–once the data has been processed in SN, we can delete the Excel file from the folder with a DELETE activity, so that any newly added data gets processed the next day
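The daily read/process/delete cycle just described can be sketched as follows. This is only an illustration, assuming a CSV file and a "CaseID" column; in UiPath the three steps would be Read Range, For Each Row, and Delete:

```python
import csv
import os

def process_case(case_id):
    # stand-in for the real ServiceNow + SAP steps per case
    return f"done: {case_id}"

def run_daily(path):
    """Read the stored cases, process them one by one, then delete the file."""
    results = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):   # the FOR EACH ROW loop
            results.append(process_case(row["CaseID"]))
    os.remove(path)                     # the DELETE step
    return results

# Prepare yesterday's file, then run today's processing
with open("sn_cases.csv", "w", newline="") as f:
    f.write("CaseID\nCS001\nCS002\n")
print(run_daily("sn_cases.csv"))  # → ['done: CS001', 'done: CS002']
```

Deleting the file at the end is what guarantees that tomorrow's run only sees rows appended after today's processing finished.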

Cheers @Latif


Now help me: I have one page of data from scraping in Excel…

What do I do next? I want to read the first row and then click on that particular row’s SN number…
(Do I need to attach the page again, or how?)

We can even search for that ID in the search bar of ServiceNow, right?
If so, within the FOR EACH ROW activity use:
–an ASSIGN activity to get the value of the case number, like this:
str_case = row("columnname").ToString
–then an OPEN BROWSER activity with the ServiceNow URL as input; this opens the webpage we want, where we can search for that case and do the processing
–at last, still within the FOR EACH ROW loop, use a KILL PROCESS activity with "chrome" as the process name if the Chrome browser is used, or "iexplore" in the ProcessName property

–since this OPEN BROWSER is kept inside the loop, a new browser page will be opened for each iteration and each row value, and the data will be fetched
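The loop above can be sketched like this. It is only an illustration: the instance address and the list URL pattern are assumptions (ServiceNow list views generally accept a `sysparm_query` filter), and "columnname" is the same placeholder as in the Assign above:

```python
from urllib.parse import quote

BASE_URL = "https://example.service-now.com"  # hypothetical instance

def case_search_url(case_number):
    """Build the URL the Open Browser activity would navigate to,
    filtering the case list down to one case number (assumed URL pattern)."""
    return f"{BASE_URL}/case_list.do?sysparm_query=number%3D{quote(case_number)}"

rows = [{"columnname": "CS0001001"}, {"columnname": "CS0001002"}]
for row in rows:                       # the FOR EACH ROW loop
    str_case = str(row["columnname"])  # the Assign: str_case = row("columnname").ToString
    url = case_search_url(str_case)
    # Open Browser would navigate to `url` here, the case would be processed,
    # and Kill Process would close the browser before the next iteration
    print(url)
```

Searching by case number each time, instead of clicking by row position, is what makes the loop safe against new rows being added while it runs.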



Can I send you the process so you can check whether I am doing it correctly?
It’s taking a lot of time to get the data scraped… so maybe you can help, or maybe the code needs to change.

Sure @Latif

Hi Sourav,

I tried to use “Connect for ServiceNow”, but I am unable to connect to the website, because I do not have a username and password for that ServiceNow page.
The reason I want to try “Connect for ServiceNow” is to see if I can get the data faster than with my normal code.
Right now my code, without this connector, takes 1 minute to scrape the data even though I only extract 2 columns.

So can you help me, if you have used it? Thanks.
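For context on why the connector is faster: it talks to ServiceNow's REST Table API rather than driving the UI, but it still needs credentials with API access, which is exactly what is missing here. A hedged sketch of what such a request would look like (the instance URL and table name are assumptions, and nothing is actually sent):

```python
import base64
from urllib.request import Request

INSTANCE = "https://example.service-now.com"  # hypothetical instance

def build_case_request(user, password, limit=10):
    """Build (but do not send) a Table API GET for case records."""
    url = (f"{INSTANCE}/api/now/table/sn_customerservice_case"
           f"?sysparm_limit={limit}&sysparm_fields=number,short_description")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return Request(url, headers={
        "Accept": "application/json",
        "Authorization": f"Basic {token}",
    })

req = build_case_request("api.user", "secret")
print(req.full_url)
```

Without a valid username and password (or an OAuth token) this returns 401 from any real instance, so the connector cannot help until credentials are arranged.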

Main.xaml (71.7 KB)

Check it. please…