slinqs! collects information on many aspects of a site and evaluates it. Every site review is organized into seven sections: domain, contents, behaviour, SEO, social shares, header and similar sites.
At the domain level, we look into the registration, renewal and expiry dates from a whois lookup. Additionally, the TLD, the number of characters in the domain, the server location and the IP addresses are displayed. Sometimes we cannot get the exact latitude and longitude of the server, so the server map might not point to the exact location.
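As an illustration of what a whois lookup yields, here is a minimal sketch of pulling the registration and expiry dates out of raw whois text. This is not slinqs!'s actual implementation, and the field labels vary by registry; the pattern below covers one common ISO-style format.

```python
import re
from datetime import datetime

def parse_whois_dates(raw: str) -> dict:
    """Extract creation/renewal/expiry dates from raw whois output.

    Field labels differ between registries; these patterns match one
    widely used format and are for illustration only.
    """
    patterns = {
        "created": r"Creation Date:\s*(\d{4}-\d{2}-\d{2})",
        "renewed": r"Updated Date:\s*(\d{4}-\d{2}-\d{2})",
        "expires": r"Registry Expiry Date:\s*(\d{4}-\d{2}-\d{2})",
    }
    dates = {}
    for key, pat in patterns.items():
        m = re.search(pat, raw, re.IGNORECASE)
        if m:
            dates[key] = datetime.strptime(m.group(1), "%Y-%m-%d").date()
    return dates

sample = """\
Creation Date: 2001-02-03T04:05:06Z
Updated Date: 2023-01-15T00:00:00Z
Registry Expiry Date: 2026-02-03T04:05:06Z
"""
print(parse_whois_dates(sample))
```

A production lookup would first query the registry's whois server (port 43) and then parse the response like this.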
Starting with the Alexa ranking, this section shows the site's traffic level and user behaviour.
Alexa Rank has two elements: global and local. For a Japanese site, for example, the local ranking refers to the rank within Japan. The Google Analytics code shows the ID of the tracking account.
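Detecting the Google Analytics account ID comes down to finding the tracking ID embedded in the page source. A minimal sketch for the classic `UA-XXXXXXX-X` format (not slinqs!'s actual code):

```python
import re

def find_ga_id(html: str):
    """Return the first classic Google Analytics tracking ID (UA-XXXXXXX-X)
    found in the page source, or None if the page has no tracking snippet."""
    m = re.search(r"UA-\d{4,10}-\d{1,4}", html)
    return m.group(0) if m else None

page = "<script>ga('create', 'UA-1234567-8', 'auto');</script>"
print(find_ga_id(page))  # UA-1234567-8
```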
Access information contains pageviews per visit, bounce rate and time on site, obtained from Alexa. Sites with low traffic might not display this data. The main pages section lists important indexed pages of the domain.
The referrer information gives you an idea of which sources the site gets its visitors from. Similarly, the keyword section lists the keywords through which the site receives visitors from search engines.
This section shows the site's overall performance in terms of search engine optimization.
It shows the number of pages indexed in the search engines, the PageRank, and the number of domains that link to the site. The meta and directory registration information shows the meta keywords and description of the site's home page and the category where the site appears in the Dmoz directory. The sitemap and robots.txt files, if identified, are also shown here.
SEO Analysis checks basic on-page elements: whether heading tags are used correctly, whether images have alt attributes, whether the title has an appropriate length, and so on.
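The kinds of checks described above can be sketched with the standard library's HTML parser. This is an assumption-laden illustration, not slinqs!'s implementation; the title-length threshold below is a common rule of thumb, not a fixed standard.

```python
from html.parser import HTMLParser

class OnPageCheck(HTMLParser):
    """Collect basic on-page SEO signals: title length, h1 count,
    and images lacking a (non-empty) alt attribute."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.h1_count = 0
        self.imgs_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.imgs_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check(html):
    p = OnPageCheck()
    p.feed(html)
    return {
        "title_ok": 10 <= len(p.title.strip()) <= 60,  # rough length heuristic
        "single_h1": p.h1_count == 1,
        "imgs_missing_alt": p.imgs_missing_alt,
    }

html = ("<title>Example site review</title><h1>Hi</h1>"
        "<img src='a.png'><img src='b.png' alt='b'>")
print(check(html))
```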
Below that, we have a list of pages that contain links pointing to the site in question. It is merely a small portion of the site's link profile.
The changes in indexed page volume are shown in the graph section. On a weekly basis, we check the number of pages indexed, which allows us to graph the progression. However, the numbers shown in the graph might not correspond to the real figure; they tend to be lower than the real volume. Below the graph of the indexed pages, one can have a look at the link distribution on the home page and the list of extracted keywords. We analyze the href tags within the home page, with their dofollow and nofollow attributes, divided into four groups: global, images, internal links and external links.
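Classifying a page's href tags in the way described above might look like the following sketch (the grouping names and logic are illustrative assumptions, not slinqs!'s exact scheme):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkClassifier(HTMLParser):
    """Group <a href> links into internal/external and follow/nofollow buckets."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.groups = {"internal_follow": [], "internal_nofollow": [],
                       "external_follow": [], "external_nofollow": []}

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if not href:
            return
        host = urlparse(href).netloc
        internal = host in ("", self.site_host)  # relative or same-host links
        nofollow = "nofollow" in (a.get("rel") or "")
        key = ("internal" if internal else "external") + \
              ("_nofollow" if nofollow else "_follow")
        self.groups[key].append(href)

c = LinkClassifier("example.com")
c.feed('<a href="/about">About</a>'
       '<a href="https://other.com" rel="nofollow">Ad</a>'
       '<a href="https://example.com/blog">Blog</a>')
print(c.groups)
```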
The keyword extraction gives you a list of keywords that could be important to the home page. However, it has little to do with keyword density, and sometimes the list contains keywords that are unimportant or unrelated to the page.
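The simplest form of such keyword extraction is a frequency count over the visible text with common words filtered out. The stopword list and thresholds below are illustrative assumptions; slinqs! may well use a more sophisticated method.

```python
import re
from collections import Counter

# A tiny illustrative stopword list; a real extractor would use a larger one.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "for", "on"}

def extract_keywords(text, top=5):
    """Naive keyword extraction: rank words by frequency, skipping
    stopwords and very short tokens."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(top)]

page_text = ("slinqs reviews sites weekly. Site metrics include traffic, "
             "links and keywords. Review a site and compare site metrics.")
print(extract_keywords(page_text))
```

This also shows why unrelated words sometimes surface: frequency alone says nothing about topical relevance.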
This section shows a list of related sites.
If the site or domain exists within the slinqs! database, the link takes you to that site's review page for comparison.
If it does not exist in slinqs!, you are simply taken to the site itself. Nevertheless, you can type that site's name into the top bar to get its review instantly and compare the metrics of the two similar sites. From that moment on, the site you checked is scheduled for a weekly crawl, so you will get an updated site review every week.
The contents section shows the content found on the home page of the site. Screenshot and colors display the thumbnail and the site's overall color usage in images. If a site's design changes, it might take a week or so for the thumbnail and colors to be refreshed here.
The pie chart shows the tag ratio of the page. Content here means the text users can see in a browser, while tags are the HTML elements invisible to users. Additionally, the content length section shows the number of characters, the number of characters without tags and the word count on the home page.
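A rough version of these content-length figures can be computed by stripping tags and comparing lengths. The regex-based stripping below is a crude sketch for illustration (a real implementation would use a proper HTML parser):

```python
import re

def content_stats(html):
    """Compare total HTML length with visible-text length (tags stripped)."""
    text = re.sub(r"<[^>]+>", " ", html)      # crude tag stripping
    text = re.sub(r"\s+", " ", text).strip()  # normalize whitespace
    return {
        "total_chars": len(html),
        "text_chars": len(text),
        "word_count": len(text.split()),
        "text_ratio": round(len(text) / len(html), 2) if html else 0.0,
    }

page = "<html><body><h1>Hello world</h1><p>Short example page.</p></body></html>"
print(content_stats(page))
```

The `text_ratio` value corresponds to the kind of content-versus-tags split the pie chart visualizes.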
The contents table lists what is in the title element and the heading elements. If these elements are used correctly, a glance at this table should give you a rough idea of what the site is about. Below this table, we show the raw HTML of the page.
The changes-in-contents graph tells you how the home page of the site has been changing over time. If the line of the graph moves upward, the site has changed its content since the last crawl; the higher the number, the more changes were found since the last content check. If there is no change since the last crawl, the change value is 0.
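One simple way to produce such a change value is to compare the current crawl's content against the previous one and score the difference, with 0 meaning identical. The `difflib`-based scoring below is an assumption for illustration, not slinqs!'s actual metric:

```python
import difflib

def change_score(old: str, new: str) -> float:
    """0.0 means identical content since the last crawl; higher means more change."""
    similarity = difflib.SequenceMatcher(None, old, new).ratio()
    return round(1.0 - similarity, 3)

last_crawl = "Welcome to our site. News and reviews every week."
this_crawl = "Welcome to our site. Fresh news and reviews every day."
print(change_score(last_crawl, last_crawl))  # 0.0 — no change
print(change_score(last_crawl, this_crawl))
```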
Towards the end of the contents section, there are two tables with internal and external link information. Here you can check which internal and external pages the site links to. The last table shows the HTML/CSS errors found on the home page of the site. You get a summary of the volume of errors and warnings from the W3C validator, followed by a table detailing each error.
This group of information shows how popular the site is on social network platforms.
The share counts are obtained from Facebook, Twitter, Google+ and Hatena Bookmark.
As long as the Twitter account information is found within the <head> element, we display the site's Twitter account details, including the number of tweets, followers and accounts followed, and its bio. The Facebook popularity information is also displayed: you can see how many times the site has been liked, shared, commented on and visited from Facebook.
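One common convention for publishing a Twitter account in the <head> is the `twitter:site` meta tag. Assuming that convention (the source does not specify which markup slinqs! looks for), detection might be sketched as:

```python
import re

def twitter_handle(html: str):
    """Look for a twitter:site meta tag in the page head — one common
    convention for declaring a site's Twitter account."""
    m = re.search(r'<meta\s+name="twitter:site"\s+content="@?([A-Za-z0-9_]+)"',
                  html)
    return m.group(1) if m else None

head = '<head><meta name="twitter:site" content="@example"></head>'
print(twitter_handle(head))  # example
```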
The transition of the social shares is shown in the graphs. From these graphs one can tell whether a site is growing its share counts and gaining social status on the social network platforms.
This section mainly displays the header information and the site speed.
For the site speed, slinqs! makes a request to the site and waits for the server to respond. This response time is what we call site speed here. The graph shows the last 6-7 requests and their response times.
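Measuring a response time like this boils down to timing one request/response round trip. A generic sketch (in slinqs!'s case the timed callable would be the HTTP request; here a sleep stands in for a slow server):

```python
import time

def timed(fn, *args, **kwargs):
    """Return (result, elapsed_seconds) for a single call —
    e.g. one HTTP request to the site being reviewed."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Simulate a server that takes ~50 ms to respond.
_, elapsed = timed(time.sleep, 0.05)
print(round(elapsed, 2))
```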
The Header Info table lists the contents found in the response header: connection, doc type, content type, encoding, web server and programming language. The web server and programming language are often not displayed, since the header information does not always reveal how a site is built.
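Most of those fields map directly onto standard HTTP response headers. A sketch of pulling them out of a headers dictionary (the field mapping is an assumption; `Server` and `X-Powered-By` are frequently absent or hidden, which is exactly why the web server and language often cannot be shown):

```python
def summarize_headers(headers: dict) -> dict:
    """Pick out commonly listed fields from HTTP response headers."""
    h = {k.lower(): v for k, v in headers.items()}
    content_type, _, charset = h.get("content-type", "").partition("; charset=")
    return {
        "connection": h.get("connection"),
        "content_type": content_type or None,
        "encoding": charset or None,
        "web_server": h.get("server"),          # often suppressed by the server
        "language": h.get("x-powered-by"),      # often absent entirely
    }

sample = {
    "Connection": "keep-alive",
    "Content-Type": "text/html; charset=utf-8",
    "Server": "nginx",
}
print(summarize_headers(sample))
```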