Help with process (code review/fixing)

Hi guys

I have some issues with one of my projects and I was hoping the community could point me in the right direction. It's an online agenda for doctors: for each week it's supposed to scrape the appointments and patient details and write them to a CSV file. The scraping used to work great, but I changed a couple of things and now it's very unpredictable.

The main issue: I'm using a config.json to get certain values the program needs, like the start date and end date of the scraping period. At a certain date the appointments get archived; when the scraper reaches that point, it should click the archive button and keep scraping there until it reaches the end date. I made a function that compares the date of the current scrape with an expected date, and I use that same function three times with different dates, but it only seems to work for the archive date. So the program never stops: when it reaches the end date it just starts over again.

As for the scraping itself: as I said, it used to work, but now sometimes it does and sometimes it doesn't. I'll include a video to demonstrate the flow. The process is: it opens a browser with a URL it gets from the JSON file, then goes to the print page to scrape the appointments for each day of a week. From this first scrape I only need the date, time, and first and last name. I use the print page because of the way the data is displayed there; I tried scraping the normal view, but everything ends up on one row and that made the formatting too hard. I merge each day into a data table, and at the end of the week I write it to a CSV file.
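The end-date check failing while the archive-date check works is often a string-comparison problem: if the scraped date and the configured date are compared as text in different formats (e.g. "05/01/2024" vs "2024-01-05"), the equality check never fires and the loop never ends. A minimal sketch of a more robust check, assuming hypothetical config keys `startDate`/`endDate` and a `dd/MM/yyyy` display format (adjust both to what the agenda actually uses):

```python
import json
from datetime import datetime, date

DATE_FORMAT = "%d/%m/%Y"  # assumed format; change to match the agenda's display

def load_config(path: str) -> dict:
    """Read the scraping window from config.json (key names are an assumption)."""
    with open(path, encoding="utf-8") as fh:
        return json.load(fh)

def parse_date(text: str) -> date:
    """Turn a scraped or configured date string into a comparable date object."""
    return datetime.strptime(text.strip(), DATE_FORMAT).date()

def reached(current_text: str, target_text: str) -> bool:
    """Compare as dates, not strings, so formatting differences can't break the check.
    Using >= instead of == also stops the loop if the exact end date is stepped over."""
    return parse_date(current_text) >= parse_date(target_text)
```

Comparing with `>=` rather than `==` matters when the scraper advances a week at a time: if the exact end date is never landed on, an equality check is skipped forever, which matches the "it just starts over again" symptom.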
Then I navigate to the start date and click on 'vertegenwoordiger' (sales reps) to scrape all the sales-rep appointments, go to 'consultatie' (consultation) and do the same, and again for 'dringend' (urgent); these are kept in different lists. In the background I then have a function that filters the real appointments from the unimportant ones (like 'vrij' (free) or 'beluurtje' (phone hour)). Then, for every appointment, it should click the corresponding field and scrape the popup, and if it's a patient, click the profile button and scrape the missing information from the profile page. At the end of the week everything is written to CSV; patients and sales reps are kept separate.

Doctena_SanMax.zip (52.8 KB)
https://www.youtube.com/watch?v=4kr8p4cjUVA
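The filter-and-split step described above (drop calendar blocks like 'vrij' or 'beluurtje', then write patients and sales reps to separate files) could be sketched like this. The field names (`subject`, `type`) and the blocked-keyword list are assumptions, not the actual workflow's names:

```python
import csv

# assumed labels that mark a calendar block rather than a real appointment
NON_APPOINTMENTS = {"vrij", "beluurtje"}

def is_real_appointment(row: dict) -> bool:
    """Keep only rows whose subject is not a known blocker keyword."""
    return row.get("subject", "").strip().lower() not in NON_APPOINTMENTS

def write_week(rows, patients_csv, salesreps_csv):
    """Split filtered rows into two CSVs: patients vs. sales reps ('vertegenwoordiger')."""
    real = [r for r in rows if is_real_appointment(r)]
    patients = [r for r in real if r.get("type") != "vertegenwoordiger"]
    reps = [r for r in real if r.get("type") == "vertegenwoordiger"]
    for path, subset in ((patients_csv, patients), (salesreps_csv, reps)):
        if not subset:
            continue
        with open(path, "a", newline="", encoding="utf-8") as fh:
            writer = csv.DictWriter(fh, fieldnames=subset[0].keys())
            if fh.tell() == 0:  # brand-new file: write the header once
                writer.writeheader()
            writer.writerows(subset)
```

Appending week by week with a header-only-once guard matches the "write at the end of each week" flow without clobbering earlier weeks.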

Just a suggestion: why don't you scrape the entire data set and then filter it with a Filter Data Table activity, using the start date and end date from the JSON?
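Expressed outside UiPath for clarity, the suggestion amounts to scraping everything and then keeping only the rows inside the configured window. A sketch, assuming each row carries a `date` field in a `dd/MM/yyyy` format (both assumptions):

```python
from datetime import datetime, date

def within_window(rows, start: date, end: date, key="date", fmt="%d/%m/%Y"):
    """Keep only rows whose date falls inside [start, end], inclusive."""
    def parse(row):
        return datetime.strptime(row[key], fmt).date()
    return [row for row in rows if start <= parse(row) <= end]
```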


Because it would never stop: even when there are no appointments left you can still click the next button, so it would scrape forever.

Ohh, got it

One more thing: there is an option to limit the number of rows while scraping; you could do that as well. These are all alternative ways to do the task, but I'm not sure they serve your purpose.

There's no way to know how many rows it would need to scrape. Or there would be, but I'd have to calculate the number of weeks between the start and end date and hope for the best.
The data is very inconsistent, so I'm not even sure every week is only one row.
I'm not really looking to change things in a big way unless the alternative is more efficient, and sorry to say, but what you proposed doesn't seem to be. I just can't get my head around why the same method works as expected when it needs to go to the archive, but fails when it reaches the end date.
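For completeness, the week count mentioned above is a one-liner, though as noted it only gives a rough row estimate for a row limit, not a guarantee:

```python
import math
from datetime import date

def weeks_between(start: date, end: date) -> int:
    """Whole weeks needed to cover the window, rounding partial weeks up."""
    return math.ceil(((end - start).days + 1) / 7)
```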