To analyze the many aspects of a site, we use a number of tools and services on the Internet. Domains, indexed pages, social shares, traffic data: everything you see on our page is based on the following sources.
Whois data and other domain information come from here: the domain registration date, expiry date, and last renewal date, as well as IP addresses and the server location.
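As a rough illustration, here is how those date fields might be pulled out of a raw whois record. The sample record and the field names below are illustrative only; real registries differ in layout and naming.

```python
import re

# Illustrative raw whois response; real registries vary in field names and layout.
SAMPLE_WHOIS = """\
Domain Name: EXAMPLE.COM
Creation Date: 2010-04-01T00:00:00Z
Registry Expiry Date: 2026-04-01T00:00:00Z
Updated Date: 2024-03-15T09:30:00Z
"""

def parse_whois(text):
    """Extract the date fields we display from a whois record."""
    fields = {
        "registered": r"Creation Date:\s*(\S+)",
        "expires": r"Registry Expiry Date:\s*(\S+)",
        "renewed": r"Updated Date:\s*(\S+)",
    }
    out = {}
    for key, pattern in fields.items():
        match = re.search(pattern, text)
        if match:
            out[key] = match.group(1)
    return out

print(parse_whois(SAMPLE_WHOIS))
```

A production lookup would of course fetch the record from the registry's whois server first; only the parsing step is sketched here.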
In some languages, we rely on Yahoo! technology to extract the popular and important keywords within a page. We also use it to look up whether a site is registered in the Yahoo! Directory.
Traffic volume, bounce rate, time on site, pageviews per visit: all of this website analytics data is obtained from Alexa. DMOZ directory registration and backlink information can be looked up here, too.
Search engines crawl and index the pages they find. We query the search engines to find out how many pages of a site they have indexed. By checking this on a daily basis, we are able to show the historical transition of a site's index volume.
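The history behind that chart is simple: one indexed-page count per site per day, from which the day-over-day change is derived. A minimal sketch, with made-up numbers:

```python
from datetime import date

# Illustrative per-site history: one indexed-page count per day (numbers made up).
history = {
    date(2024, 5, 1): 1200,
    date(2024, 5, 2): 1250,
    date(2024, 5, 3): 1230,
}

def index_transitions(history):
    """Return (day, count, change-from-previous-day) tuples in date order."""
    out = []
    prev = None
    for day in sorted(history):
        count = history[day]
        out.append((day, count, None if prev is None else count - prev))
        prev = count
    return out

for day, count, change in index_transitions(history):
    print(day, count, change)
```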
For each site we review, we check whether it has HTML/CSS markup errors, using the W3C validator. This can also be done locally.
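For example, the W3C Nu HTML Checker can return its results as JSON, and counting the errors is a matter of filtering its message list. The response below is a made-up sample in that general shape, not output from a real validation run:

```python
import json

# Illustrative response in the general shape returned by the W3C Nu HTML Checker
# when asked for JSON output; the messages themselves are made up.
SAMPLE_RESPONSE = json.dumps({
    "messages": [
        {"type": "error", "lastLine": 12,
         "message": "Element div not allowed as child of element span."},
        {"type": "info", "subType": "warning", "lastLine": 3,
         "message": "Consider adding a lang attribute."},
    ]
})

def count_markup_errors(raw_json):
    """Count validator messages whose type is 'error' (warnings are 'info')."""
    messages = json.loads(raw_json)["messages"]
    return sum(1 for m in messages if m["type"] == "error")

print(count_markup_errors(SAMPLE_RESPONSE))
```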
If a site has a Twitter account, we query Twitter for its bio, number of tweets, followers, following, and so on. We also check how widely the site is shared socially.
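Picking those values out of a Twitter user object might look like this. The sample is in the shape of a classic user object (bio in `description`, counts in `statuses_count`, `followers_count`, `friends_count`); the account and numbers are invented:

```python
import json

# Illustrative Twitter user object; the account and counts are made up.
SAMPLE_USER = json.dumps({
    "screen_name": "example",
    "description": "An example account.",
    "statuses_count": 1523,
    "followers_count": 980,
    "friends_count": 310,
})

def profile_summary(raw_json):
    """Pick out the bio and counts we display for a site's Twitter account."""
    user = json.loads(raw_json)
    return {
        "bio": user["description"],
        "tweets": user["statuses_count"],
        "followers": user["followers_count"],
        "following": user["friends_count"],
    }

print(profile_summary(SAMPLE_USER))
```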
We access Google+ to gauge the social popularity of the site.
Facebook likes, shares, and comments are obtained through the Facebook API.
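As a sketch, the Graph API can report per-URL engagement as grouped counts, which we would then total up. The response below is only an assumed sample in that shape, with invented numbers:

```python
import json

# Illustrative response in the shape of a Graph API per-URL "engagement" object;
# the numbers are made up.
SAMPLE_RESPONSE = json.dumps({
    "engagement": {
        "reaction_count": 120,
        "comment_count": 8,
        "share_count": 45,
        "comment_plugin_count": 0,
    },
    "id": "https://example.com/",
})

def total_engagement(raw_json):
    """Sum reactions, comments, and shares from an engagement response."""
    engagement = json.loads(raw_json)["engagement"]
    return sum(engagement.values())

print(total_engagement(SAMPLE_RESPONSE))
```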
Social mentions and shares of a site are looked up on Hatena Bookmark, too.
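Hatena Bookmark exposes a simple public bookmark-count lookup. The endpoint host and path below are an assumption based on its public entry-count API and may change; the sketch only builds the request URL and parses a plain-text count:

```python
from urllib.parse import urlencode

# Assumed Hatena Bookmark entry-count endpoint; host/path may differ or change.
COUNT_ENDPOINT = "https://bookmark.hatenaapis.com/count/entry"

def count_request_url(page_url):
    """Build the GET URL that asks Hatena for a page's bookmark count."""
    return COUNT_ENDPOINT + "?" + urlencode({"url": page_url})

def parse_count(body):
    """The endpoint answers with the count as plain text; empty means zero."""
    body = body.strip()
    return int(body) if body else 0

print(count_request_url("https://example.com/"))
print(parse_count("42"))
```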