featured-work

Website Audit

www.bbsp-refurbishment.co.uk
0 Important fixes
9 Semi-important fixes
15 Crawled pages
139 Passed checks
Domain overview
Domain characteristics
IP address
104.27.182.59
Server location
USA
Expiration date
2020-09-25
Web archive age
2018-08-11
SEO metrics
Yandex X
0
Moz DA
3
Alexa rank
0
Backlinks
23
Index status
Google
32
Bing
41
Yahoo
41
Yandex
21

Health check

0 Important fixes
0 Semi-important fixes
Wondering how well your website is performing? This section checks your website against the major SEO parameters: whether your site has a proper WWW redirect, SEO- and user-friendly URLs with trailing slashes, a functional robots.txt file, an XML sitemap, duplicate content, and so on. Use the health check to eliminate errors, and follow our recommendations to increase your search engine visibility.
WWW redirect
This check verifies whether your website resolves consistently with or without the www prefix, so the two variants don't create duplicate content.
URL rewrite
Rewrite your URLs so users and search engines see clean, readable addresses. URLs shouldn't contain vague elements that make them hard to read. SEO-friendly URLs make pages easier to rank in search engines and easier to share on social media. Use hyphens rather than underscores as word separators in your URLs.
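As an illustration of the clean-URL advice above, here is a minimal slug generator: it lowercases a title and collapses punctuation, spaces and underscores into hyphens. The sample title is made up.

```python
import re

def seo_slug(title: str) -> str:
    """Turn an article title into a clean, hyphen-separated URL slug.

    Hyphens are used instead of underscores, since search engines
    treat hyphens as word separators.
    """
    slug = title.lower()
    # Collapse every run of non-alphanumeric characters (spaces,
    # punctuation, underscores) into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(seo_slug("My_First Article: 10 SEO Tips!"))
# my-first-article-10-seo-tips
```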
Trailing slashes
Adding a trailing slash to the end of directory URLs keeps them clean and consistent; whichever convention you choose, redirect the other form so each page lives at a single URL.
Redirects HTTP traffic to HTTPS
HTTPS protects user data (including payment details) and is a confirmed ranking signal for search engines. In 2016 Google announced that, starting in January 2017, Chrome would begin labelling HTTP pages that collect sensitive data as "Not secure", so sites that stay on HTTP gradually lose trust and positions.
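The WWW, trailing-slash and HTTPS checks above all enforce one canonical form per URL. A sketch of that normalization using only the standard library (the www-preferred policy and the sample URL are illustrative assumptions):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str, prefer_www: bool = True) -> str:
    """Return the canonical form of a URL: HTTPS, a single www
    policy, and a trailing slash on extension-less paths."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[4:]
    path = parts.path or "/"
    # Append a slash only to directory-style paths (no file extension).
    if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        path += "/"
    return urlunsplit(("https", host, path, parts.query, parts.fragment))

print(canonicalize("http://bbsp-refurbishment.co.uk/services"))
# https://www.bbsp-refurbishment.co.uk/services/
```

In production this normalization would be a 301 redirect in the web server configuration; the function only shows the target URL each variant should redirect to.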
Robots.txt
The robots.txt file restricts access for selected search engine robots, preventing them from indexing specific pages or the whole website. It can also contain a link to the XML sitemap file, which helps search engine crawlers discover and index the maximum number of the website's pages.
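As a rough sketch of how a crawler interprets these rules, Python's `urllib.robotparser` can parse a robots.txt body directly. The rules and sitemap URL below are hypothetical, not the audited site's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block /admin/ for everyone, advertise the sitemap.
robots_txt = """\
User-agent: *
Disallow: /admin/
Sitemap: https://www.bbsp-refurbishment.co.uk/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://www.bbsp-refurbishment.co.uk/"))        # True
print(rp.can_fetch("*", "https://www.bbsp-refurbishment.co.uk/admin/"))  # False
print(rp.site_maps())  # Python 3.8+: list of Sitemap URLs
```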
XML sitemap
An XML sitemap lets you specify each URL's change frequency, last modification time and relative importance, helping search engines crawl the site easily and intelligently.
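A minimal generator for such a sitemap, emitting the `loc`, `lastmod`, `changefreq` and `priority` fields mentioned above (the entry values are placeholders, not the audited site's real data):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: iterable of (loc, lastmod, changefreq, priority) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("https://www.bbsp-refurbishment.co.uk/", "2019-06-01", "weekly", "1.0"),
])
print(sitemap_xml)
```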
No flash
When optimizing content for search engines, avoid Flash as much as possible: search engines can't properly index Flash content, so reserve it for specific enhancements at most.
No frame
Try not to use frames on your website, as search engines can't reliably crawl or index content delivered through them.
Common homepage variations
Make sure your homepage isn't reachable at multiple URLs, as Google will treat the extra variants as duplicate content.
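The duplicates in question are the standard homepage variants. A quick sketch enumerating them for a domain, each of which should 301-redirect to the single canonical address (the path variants listed are common conventions, not necessarily this site's):

```python
def homepage_variants(domain: str):
    """Enumerate common homepage URLs that should all resolve,
    via 301 redirects, to a single canonical address."""
    variants = []
    for scheme in ("http", "https"):
        for host in (domain, "www." + domain):
            for path in ("/", "/index.html", "/index.php"):
                variants.append(f"{scheme}://{host}{path}")
    return variants

urls = homepage_variants("bbsp-refurbishment.co.uk")
for u in urls:
    print(u)
```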

Pages analysis

0 Important fixes
0 Semi-important fixes
Here you can see everything Google's robots encountered while crawling your website over the last months: pages with overly long URLs, pages blocked by robots.txt, oversized pages, noindex meta tags, rel="canonical", rel="alternate", meta refresh redirects and so on.
8 Pages with 2xx response
4 Pages with 3xx response
0 Pages with 4xx response
0 Pages with 5xx response
0 Pages with too big size
0 Pages with too long URL
0 Pages blocked by robots.txt
2 Pages blocked by meta noindex
0 Pages blocked by meta nofollow
0 Pages with meta refresh redirect
10 Pages with rel="canonical"
0 Pages without rel="canonical"
0 Pages with the same canonical URL rel="canonical"
0 Pages with duplicate rel="canonical" tag
6 Pages with rel="alternate"
0 Pages with the hreflang attribute
6 Pages with no hreflang attribute
0 Pages with errors in the hreflang attribute
0 Pages with mixed content
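The 2xx/3xx/4xx/5xx counts above group crawled pages by HTTP status class. A minimal sketch of that bucketing (the crawl data is invented for illustration):

```python
from collections import Counter

def status_summary(responses):
    """Group crawled pages by HTTP status class (2xx/3xx/4xx/5xx)."""
    return Counter(f"{code // 100}xx" for code in responses.values())

# Hypothetical crawl result: URL -> HTTP status code.
crawl = {
    "https://example.co.uk/": 200,
    "https://example.co.uk/about/": 200,
    "https://example.co.uk/old-page/": 301,
}
print(status_summary(crawl))  # Counter({'2xx': 2, '3xx': 1})
```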

Meta analysis

0 Important fixes
0 Semi-important fixes
Make sure your titles and meta descriptions are unique, with titles of 10-70 characters and descriptions of 50-320 characters. Both should contain your important keywords, and your meta descriptions influence how your search results appear. Check Google Search Console for warnings about duplicate titles and descriptions.
0 Pages with duplicate title
0 Pages with missing or empty Title
0 Pages with Title too long > 70
0 Pages with Title too short < 10
0 Pages with missing or empty Description
0 Pages with duplicate Description
0 Pages with Description too short < 50
0 Pages with Description too long
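The length checks above can be sketched as a small validator using the 10-70 and 50-320 character ranges stated in this section (the sample title and description are made up):

```python
def check_meta(title: str, description: str):
    """Flag titles and meta descriptions outside the recommended
    length ranges (10-70 and 50-320 characters respectively)."""
    issues = []
    if not 10 <= len(title) <= 70:
        issues.append("title length out of range")
    if not 50 <= len(description) <= 320:
        issues.append("description length out of range")
    return issues

print(check_meta("Home", "Too short."))
# ['title length out of range', 'description length out of range']
```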

Content analysis

0 Important fixes
4 Semi-important fixes
Keep in mind that unique content plays a vital role in search engine results. Your website structure should use HTML headings (h1-h6): put your most important keywords in the H1, use keywords in the other headings, and don't duplicate heading text. In practice, sticking to h1-h3 is usually best.
Write only unique content, not duplicated or rewritten text. Aim for at least 400 words per page, but don't pad articles out either; keep to a happy medium, and minimize spelling and grammar mistakes.
0 Pages with duplicate content
0 Pages with empty H1 tag
0 Pages with no H1 tag
0 Pages with duplicate H1 tag
0 Pages with empty H2 tag
0 Pages with no H2 tag
0 Pages with H2 too long

Links analysis

0 Important fixes
0 Semi-important fixes
Keep to no more than 100 outgoing links per page, and link only to quality websites. Use rel="nofollow" on external links you don't want to endorse to search engines. Optimize the anchor text of your links with important keywords, but avoid keyword stuffing and spammy links. You can analyze your interlinking structure in Google Search Console under Search Traffic > Internal links.
0 Pages with too many outgoing links > 100
0 URLs with the excessive number of redirects > 5
0 Internal links with missing anchor
0 External links with missing anchor
0 Internal links use rel="nofollow"
0 External links use rel="nofollow"
0 Pages with no inbound internal links
13 External links use rel="dofollow"
0 External links with 4xx status
0 External links with 5xx status
13 External links
7 Links in the XML site map
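A sketch of the link audit described above: count outgoing links on a page, split them into internal and external by host, and note which external links carry rel="nofollow". The sample page markup and hosts are invented:

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class LinkAudit(HTMLParser):
    """Count internal/external links and rel="nofollow" externals."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal, self.external, self.nofollow = 0, 0, 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        host = urlsplit(attrs.get("href", "")).netloc
        if host and host != self.site_host:
            self.external += 1
            if "nofollow" in (attrs.get("rel") or ""):
                self.nofollow += 1
        else:
            # Relative URLs and same-host URLs count as internal.
            self.internal += 1

page = ('<a href="/contact">Contact</a>'
        '<a href="https://twitter.com/x" rel="nofollow">Twitter</a>'
        '<a href="https://partner.example">Partner</a>')
audit = LinkAudit("www.bbsp-refurbishment.co.uk")
audit.feed(page)
print(audit.internal, audit.external, audit.nofollow)  # 1 2 1
```

A real audit would also flag pages where `external` exceeds 100, per the guideline above.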

Images analysis

0 Important fixes
0 Semi-important fixes
Keep your alt text and image titles unique per image, use no more than about 7 words of alt text, and avoid keyword stuffing. Make images informative, describe them with your important keywords, and serve good-quality images with an explicit width and height for each one.
0 Images with missing ALT text
0 Images with 4xx status
0 Images with 5xx status
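The missing-alt-text check above is straightforward to sketch with the standard-library HTML parser (the sample markup is invented):

```python
from html.parser import HTMLParser

class AltTextCheck(HTMLParser):
    """Collect the src of every <img> whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

page = '<img src="/logo.png" alt="BBSP logo"><img src="/hero.jpg">'
check = AltTextCheck()
check.feed(page)
print(check.missing)  # ['/hero.jpg']
```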

Page load optimization

For desktops
0 Important fixes
1 Semi-important fixes
Uses efficient cache policy on static assets
A long cache lifetime can speed up repeat visits to your page. Learn more.
Avoid multiple page redirects
Redirects introduce additional delays before the page can be loaded. Learn more.
Enable text compression
Text-based resources should be served with compression (gzip, deflate or brotli) to minimize total network bytes. Learn more.
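Compression is normally enabled in the web server or CDN configuration rather than in application code. Purely to illustrate the network-byte saving, here is a quick demonstration with Python's gzip module on a made-up HTML payload:

```python
import gzip

# Repetitive sample payload, standing in for a real HTML response body.
html = b"<p>repeated content </p>" * 200
compressed = gzip.compress(html)
saving = 100 * (1 - len(compressed) / len(html))
print(f"{len(html)} bytes -> {len(compressed)} bytes ({saving:.0f}% smaller)")
```

Real markup compresses less dramatically than this repetitive sample, but 60-80% reductions on text resources are typical.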
Minify CSS
Minifying CSS files can reduce network payload sizes. Learn more.
Minify JavaScript
Minifying JavaScript files can reduce payload sizes and script parse time. Learn more.
Eliminate render-blocking resources
Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. Learn more.
Efficiently encode images
Optimized images load faster and consume less cellular data. Learn more.
Page isn’t blocked from indexing
Search engines are unable to include your pages in search results if they don't have permission to crawl them. Learn more.
Defer offscreen images
Consider lazy-loading offscreen and hidden images after all critical resources have finished loading to lower time to interactive. Learn more.
Properly size images
Serve images that are appropriately-sized to save cellular data and improve load time. Learn more.
JavaScript execution time
Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this. Learn more.
Minimizes main-thread work
Consider reducing the time spent parsing, compiling and executing JS. You may find delivering smaller JS payloads helps with this. Learn more.
Avoids an excessive DOM size
A large DOM will increase memory usage, cause longer style calculations, and produce costly layout reflows. Learn more.
For mobile
0 Important fixes
3 Semi-important fixes
JavaScript execution time
Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this. Learn more.
Uses efficient cache policy on static assets
A long cache lifetime can speed up repeat visits to your page. Learn more.
Minimize main-thread work
Consider reducing the time spent parsing, compiling and executing JS. You may find delivering smaller JS payloads helps with this. Learn more.
Avoid multiple page redirects
Redirects introduce additional delays before the page can be loaded. Learn more.
Enable text compression
Text-based resources should be served with compression (gzip, deflate or brotli) to minimize total network bytes. Learn more.
Minify CSS
Minifying CSS files can reduce network payload sizes. Learn more.
Minify JavaScript
Minifying JavaScript files can reduce payload sizes and script parse time. Learn more.
Eliminate render-blocking resources
Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. Learn more.
Efficiently encode images
Optimized images load faster and consume less cellular data. Learn more.
Page isn’t blocked from indexing
Search engines are unable to include your pages in search results if they don't have permission to crawl them. Learn more.
Defer offscreen images
Consider lazy-loading offscreen and hidden images after all critical resources have finished loading to lower time to interactive. Learn more.
Properly size images
Serve images that are appropriately-sized to save cellular data and improve load time. Learn more.
Avoids an excessive DOM size
A large DOM will increase memory usage, cause longer style calculations, and produce costly layout reflows. Learn more.

Usability and technologies

0 Important fixes
1 Semi-important fixes
This section covers the usability and technical requirements of a properly functioning website. Start by checking that your website has a favicon in place, valid markup and a custom 404 error page. Also check your loading and running speed, which directly affects rankings, and test whether the site is safe for browsing.
Favicon
Make sure your website has a favicon that is consistent with your brand. A favicon gives your site a professional look and makes it easy to identify.
Custom 404 page
Use a custom 404 page. Done well, it helps visitors find the content they were looking for, provides other useful information and encourages them to stay on your website.
Website speed avg: 1020 ms
A fast website has a great impact on your ranking: good speed brings you more visitors and rewards you with higher conversion rates.
Safe browsing
Keep your website free of phishing and malware. You can verify that it is safe to browse with Google Safe Browsing.
W3C HTML validation
Make sure your markup is valid and free of HTML errors: syntax errors can make your pages harder for search engines to index.
You can check for errors with the W3C validation service and confirm that your website meets web standards.

Free website audit

Want to know how your site, or a competitor's, fares in a technical audit?

👋 Just let us know the domain.


How managed
web design works

Everything you need to make your web presence a success: 9 reasons why a managed website is best for your project


Frequently Asked Questions


You're free to cancel your subscription at any time. Upgrade whenever you need a higher plan, or downgrade after 12 months if you no longer need the features of a higher plan.

Since we don't charge you for the cost of development but only for the ongoing support, updates and maintenance, we expect you to stay with us for 12 months until our investment returns. If for whatever reason you wish to terminate your subscription before that period ends, that's ok, too. We'll take your site offline and cancel your subscription. No hard feelings.

If you want to end your journey with us after 12 months we cancel your subscription and send you the latest copies of your backup.

Managing your own content sounds great at first, but a good online reputation is paramount for increasing conversion. A fully managed website doesn't just receive regular backups, updates and security scans: by letting us manage changes to your content, it stays geared towards optimum speed, design and SEO. Properly sized and compressed images, search-engine-friendly formatting and user experience are only a few of the factors involved.

Plans are flexible. We can customize them to your needs. Let's say you need the features of our Start Plan but with more developer time or the other way around. No problem.

Weblings is proudly 100% remote, saving commuting and office costs, and runs on a subscription-based model: a low monthly fee for you, without the huge upfront cost, and recurring revenue for us.