Comment by andrew_zhong 5 days ago

I will add a PR to enforce robots.txt before the actual scraping.
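For reference, enforcing robots.txt before fetching can be done with the standard library's `urllib.robotparser`. This is a minimal sketch, not the actual PR: the robots.txt body and the bot name are placeholders, and in practice the rules would be fetched from the target site's `/robots.txt`.

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt body; a real crawler would fetch this from
# https://<target-site>/robots.txt before making any other request.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def allowed(user_agent: str, url: str) -> bool:
    """Return True only if robots.txt permits this agent to fetch url."""
    return parser.can_fetch(user_agent, url)
```

With the rules above, `allowed("ExampleBot", "https://example.com/private/x")` returns `False`, so the scraper would skip that URL entirely.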
messe 5 days ago
Or just follow web standards and define and publish your User-Agent header, so that people can block that as needed.
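Concretely, publishing a User-Agent means sending a stable, documented agent string on every request, so operators can match it in robots.txt or server rules. A minimal sketch with `urllib.request`; the "ExampleBot" name and contact URL are placeholders for whatever the crawler actually publishes:

```python
from urllib.request import Request

# Placeholder agent string: a real crawler should publish its actual
# name and an info/contact URL so site operators can block it.
USER_AGENT = "ExampleBot/1.0 (+https://example.com/bot-info)"

# Every outgoing request carries the identifying header.
req = Request("https://example.com/page", headers={"User-Agent": USER_AGENT})
```

Site operators can then target the bot by name, e.g. `User-agent: ExampleBot` / `Disallow: /` in robots.txt, or a match on the header in server config.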
You're creating the wrong kind of value. I really hope your company fails, as its success implies a failure of the web in general.
I wish you the best success outside of your current endeavour.