I think there is definitely some room for AI in this space.
Once you are scraping or interacting with external services, there is always the "staying up-to-date" problem.
Ideally you want your selection rules and action rules to be invariant to small changes in the user interface. This includes monitoring for various anomalies like network errors, captchas, UI changes, and anti-scraping measures. An AI can help with that.
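As a minimal sketch of what "invariant selection rules" could look like (all names here are hypothetical, and the regexes stand in for real CSS/XPath selectors you'd use in practice): try several extraction strategies in order, so a renamed id or moved element only degrades to the next rule instead of silently breaking the scraper, and raise an anomaly when nothing matches so a monitoring or AI layer can step in.

```python
import re

def resilient_extract(html, strategies):
    """Try each (name, strategy) pair until one returns a value.

    A strategy is any callable taking the page source and returning the
    extracted value, or None if its rule no longer matches.
    """
    for name, strategy in strategies:
        try:
            result = strategy(html)
        except Exception:
            continue  # a broken rule is treated like a non-match
        if result is not None:
            return name, result
    # No rule matched: surface an anomaly instead of failing silently,
    # e.g. to alert a human or an AI that proposes a replacement rule.
    raise LookupError("all selectors failed; page layout may have changed")

# Toy rules: the id-based rule breaks after a redesign,
# but the text-based fallback still matches.
strategies = [
    ("by-id", lambda h: (m := re.search(r'id="price">([^<]+)', h)) and m.group(1)),
    ("by-label", lambda h: (m := re.search(r'Price:\s*(\$[\d.]+)', h)) and m.group(1)),
]

old_page = '<span id="price">$9.99</span>'
new_page = '<div class="cost">Price: $9.99</div>'
print(resilient_extract(old_page, strategies))  # ('by-id', '$9.99')
print(resilient_extract(new_page, strategies))  # ('by-label', '$9.99')
```

The failure case is the interesting part: instead of shipping stale or empty data, the scraper knows it's broken, which is exactly the signal an automated repair loop needs.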
The goal is to become fire-and-forget. You can also extend the technology to be collaborative and add some analytics that leverage AI tools.