13 years of development experience, more than 620,000 sites, and over 10 billion pages crawled.
It has been adopted as a Web marketing tool by companies of every size, from large to small.
Many companies and organizations continue to rely on it as a new infrastructure for the big data era.
|Cloud-ready||A cloud-based service that can be configured from anywhere, at any time|
|Parallel crawling||Supports crawls at the scale of tens of millions to hundreds of millions of pages through parallel crawling|
|Crawls any site||Keywalker.js, our in-house browser automation language, can extract information even from dynamic sites|
|Flexible crawl configuration||
|Acquires a wide variety of document formats||HTML, RSS, SITEMAP, PDF, Office, and so on…|
|Tableau integration (optional)||Acquired data can be analyzed and visualized in conjunction with Tableau (a BI tool)|
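As a rough illustration of the parallel-crawling idea described above, the sketch below fans page fetches out across a worker pool. It is a generic minimal example, not the product's actual implementation; the `fetch` callable, URL list, and worker count are all hypothetical placeholders.

```python
# Minimal sketch of parallel crawling with a thread pool.
# The fetch function is supplied by the caller (illustrative only).
from concurrent.futures import ThreadPoolExecutor

def crawl(urls, fetch, max_workers=8):
    """Fetch many pages concurrently; returns a {url: result} mapping."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order, so zip pairs each URL
        # with its own fetch result.
        return dict(zip(urls, pool.map(fetch, urls)))
```

At real scale this pattern is typically extended with per-host rate limiting, retry logic, and a distributed work queue rather than a single in-process pool.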
On-site interview to investigate your workflow (free of charge)
Rough estimate for automating the workflow (free of charge)
Building the automation system based on the workflow definition
Installation (in the user's PC environment or our cloud environment)