

How does a crawler work?

12/05/2022 | by Patrick Fischer, M.Sc., Founder & Data Scientist: FDS
A crawler is a program that systematically navigates the World Wide Web. Starting from one or more seed URLs, it downloads each page, analyzes its content, and follows the links it finds to reach further pages. In this way it builds up a picture of the web's structure and stores it in a database, for example recording all links it has discovered or all words that appear on each page. This database can then serve many purposes, such as powering search engines, price comparison services, or scientific research. A minimal sketch of this fetch-parse-follow loop is shown below.
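
To make the loop concrete, here is a minimal breadth-first crawler sketch in Python. It is an illustration under stated assumptions, not a production implementation: it assumes the third-party packages `requests` and `beautifulsoup4` are installed, and the function name `crawl`, the `max_pages` limit, and the seed URL `https://example.com` are hypothetical choices for this example.

```python
# Minimal breadth-first crawler sketch.
# Assumes the third-party packages `requests` and `beautifulsoup4` are installed.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def crawl(seed_url, max_pages=50):
    """Download pages starting from seed_url, following links breadth-first.

    Returns a dict mapping each visited URL to the list of links found on it,
    i.e. a small piece of the web's link structure.
    """
    queue = deque([seed_url])   # pages scheduled for download
    link_graph = {}             # URL -> links found on that page

    while queue and len(link_graph) < max_pages:
        url = queue.popleft()
        if url in link_graph:
            continue            # already visited, skip

        try:
            response = requests.get(url, timeout=10)
            response.raise_for_status()
        except requests.RequestException:
            continue            # skip unreachable or failing pages

        # Parse the HTML and extract every hyperlink (<a href="...">),
        # resolving relative links against the current page's URL.
        soup = BeautifulSoup(response.text, "html.parser")
        links = [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]
        # Keep only http(s) links; drop mailto:, javascript:, etc.
        links = [l for l in links if urlparse(l).scheme in ("http", "https")]

        link_graph[url] = links  # store the page's link structure
        queue.extend(links)      # schedule newly discovered pages

    return link_graph


if __name__ == "__main__":
    # "https://example.com" is a placeholder seed URL for demonstration.
    graph = crawl("https://example.com", max_pages=10)
    for page, links in graph.items():
        print(page, "->", len(links), "links")
```

A real crawler would additionally respect each site's robots.txt, rate-limit its requests, and normalize URLs before deduplicating; the sketch omits these details for brevity.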