The concept does not require me to program the logic. Instead, it starts by reading a seed page, stores the information/knowledge it finds, breaks it down into doable actions, and performs those actions one by one.
This design allows the information to direct the concept, whereas with a web crawler I would have to direct it.
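Roughly, that loop might look like the sketch below. This is just an illustration, not a real implementation; `fetch`, `extract_facts`, and `derive_actions` are hypothetical placeholders for whatever actually reads pages and plans actions:

```python
from collections import deque

# Sketch of the concept: the seed action is the only hard-coded step;
# after that, the information found on pages decides what happens next.
# fetch / extract_facts / derive_actions are hypothetical callables.
def run_concept(seed_url, fetch, extract_facts, derive_actions):
    knowledge = []                          # everything learned so far
    actions = deque([("read", seed_url)])   # start from the seed page
    while actions:
        kind, target = actions.popleft()
        if kind == "read":
            page = fetch(target)                 # read a page
            facts = extract_facts(page)          # store information/knowledge
            knowledge.extend(facts)
            actions.extend(derive_actions(facts, knowledge))  # let it direct itself
    return knowledge
```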
A web crawler, by contrast, requires me to program the logic.
You want the system to have learned things (discretion, taste...), but it cannot learn them from a human programming the logic, so it must learn either from
1. Something like Machine Learning
or..
2. Some emergent property of mathematics / computation.
If you find 2... hell that would be something.
Maybe genetic algorithms?
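For what it's worth, a genetic algorithm is easy to sketch. Here is a toy version that evolves a weight vector; the fitness function is a made-up stand-in for however you would score the agent's choices:

```python
import random

# Toy genetic algorithm: selection, crossover, mutation.
# fitness is assumed to map a weight vector to a score (higher is better).
def evolve(fitness, dim=4, pop_size=20, generations=50):
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]               # keep the fitter half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]   # crossover
            child[random.randrange(dim)] += random.gauss(0, 0.1)  # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# e.g. evolve(lambda w: -sum(x * x for x in w)) drifts toward the zero vector
```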
A good web crawler can just go through all the links it finds; no special logic is required.
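To be concrete, here is a minimal breadth-first crawler using only the Python standard library. It's a sketch: a real one would need robots.txt handling, politeness delays, proper HTML parsing, and so on:

```python
import re
from collections import deque
from urllib.parse import urljoin
from urllib.request import urlopen

# Follow every link found, breadth-first, up to a fixed page limit.
def crawl(seed, limit=50):
    seen, queue = {seed}, deque([seed])
    while queue and len(seen) < limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue                                   # skip unreachable pages
        for href in re.findall(r'href="([^"#]+)"', html):
            link = urljoin(url, href)                  # resolve relative links
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen
```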
Define it.