Ok, there is an alternative to scraping and then filtering rows. In theory this will also be faster (I don't know the website structure): you can skip data scraping entirely if all you want is to go to the next page (click on the URL link in the Event Description).
In any HTML table, the tableRow attribute can be used instead of idx, and I can see a tableRow selector on your Event Description UiExplorer screenshot.
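For reference, a web-table cell selector with a tableRow attribute typically looks something like this (the tag and the tableRow/tableCol values here are placeholders; your actual values come from UiExplorer):

```
<webctrl tag='TD' tableRow='3' tableCol='2' />
```

The point of tableRow over idx is that it counts rows of the table itself, so it stays stable even when the page contains other similar elements.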
Check if you have successfully navigated to the “Event Result” table by using Element Exists or Check App State (if you are using the Modern Design Experience) on the table.
Prepare your tableRow value as a variable (e.g. rowNumber) and use that variable in the tableRow selector in UiExplorer.
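A sketch of what the dynamic selector could look like, assuming an Int32 variable named rowNumber is bound into the selector (via “Use Variable” in the selector editor); the tag and tableCol values are placeholders:

```
<webctrl tag='TD' tableRow='{{rowNumber}}' tableCol='2' />
```

Incrementing rowNumber in your loop then moves the same activity down the table one row at a time.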
Optional: scrape the table using Data Scraping. This way you can use the scraped DataTable to get the number of rows in the UI (e.g. yourDataTable.Rows.Count), just for getting the number of rows, not for any filtering.
If you don't scrape the table, use a loop with a Try Catch around the row access instead. The Try Catch ensures that if rowNumber goes higher than the number of rows in the table, your automation will not stop; hitting the exception simply means you have reached the last row of the current page.
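The loop-until-failure pattern above, sketched in Python. The get_cell helper is hypothetical; in UiPath it corresponds to a Get Attribute whose selector uses the rowNumber variable, and the LookupError stands in for the selector-not-found exception the Try Catch absorbs:

```python
def get_cell(row_number, rows):
    """Hypothetical stand-in for Get Attribute with a tableRow selector.

    Raises when the row does not exist, like a selector-not-found error.
    """
    if row_number >= len(rows):
        raise LookupError(f"no element matches tableRow={row_number}")
    return rows[row_number]

def read_all_rows(rows):
    """Read cells row by row until the selector fails (last row reached)."""
    results = []
    row_number = 0
    while True:
        try:
            results.append(get_cell(row_number, rows))
        except LookupError:
            break  # reached the end of the current page
        row_number += 1
    return results
```

If you do scrape the table first (the optional step above), you know the row count in advance and can use a plain For loop with no Try Catch at all.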
Within the loop you can use Get Attribute to read the URL (e.g. the href attribute of the link) of the Event Description cell at the current rowNumber (the tableRow value). This way you can extract the URL for each row. If you want to have it in your scraped DataTable later, you can also choose to do that.
You can also get the values for your filter on Status by using the same logic: Get Attribute → innertext on the Status and Description cells.
So there will be some If conditions in this step, so that you only collect URLs for rows where your filter conditions are met.
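A minimal sketch of that filtering step in Python. The field names, the status value "Open", and the "maintenance" keyword are all assumptions; substitute your own conditions:

```python
def collect_matching_urls(rows):
    """Keep the URL only for rows whose Status/Description pass the filter.

    Each row is a dict holding the innertext of the Status and Description
    cells plus the href read from the Event Description link.
    """
    urls = []
    for row in rows:
        # Example conditions -- replace with your real filter logic
        if row["status"] == "Open" and "maintenance" not in row["description"].lower():
            urls.append(row["url"])
    return urls
```

In UiPath this maps to an If activity inside the loop, with the collected URLs appended to a list or DataTable variable.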
5. Pagination: if your table has multiple pages, then at the end of Step 3 you click on Next to move to the next page.
We use this approach and it has worked flawlessly. We use Step 3 so that we don't need to have a Try Catch in Step 4.
Hope this helps!