Our friends over at Bing recently posted about the 9 areas of your website which you should be able to control as a website owner.
Whether you control them directly or influence them through your web developer is irrelevant… as long as changes can be made in a reasonable timeframe.
This list is especially useful for people who are in the process of developing a new website or redeveloping their existing one.
So here goes (in no particular order)
Rel=canonical

“Being able to tell the engine which version of your URL you’d like to have attributed as the original is pretty useful. This handy command can help you build value on the version of the URL that matters most to you, and help combine value attributed to many versions of the URL into one location, helping boost the rank of that one, original version of the URL.”
Put simply, I don’t know how many times I’ve seen websites where the URLs http://www.yourdomain.com and http://yourdomain.com are pointing to separate versions of the same page. What an SEO nightmare. Get control of the rel=canonical tag and avoid this obvious mistake across your website.
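As an illustration (the domain and page path below are placeholders), a canonical link placed in the <head> of every duplicate version points the engines at the one URL you want credited:

```html
<!-- Placed in the <head> of every variant of the page, e.g. both
     http://yourdomain.com/widgets and http://www.yourdomain.com/widgets.
     "yourdomain.com" and "/widgets" are placeholder values. -->
<link rel="canonical" href="http://www.yourdomain.com/widgets" />
```

With this in place, link value pointing at the non-www version is consolidated onto the www version.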
Robots.txt

“Seems like a no-brainer, this one, but so many websites remain without a robots.txt file. In some cases it’s a purely missed opportunity, or the site owner is unaware of what a robots.txt file is. In other cases, though, it’s the inability to place a file on the root of your domain.”
A robots.txt file is a vital control for your website. It helps you restrict crawler access, avoid duplicate content indexing and also minimize bandwidth usage. For a detailed rundown on robots.txt usage, check out The Definitive Guide to Robots.txt for SEO.
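As a quick sketch (the disallowed paths here are hypothetical), a minimal robots.txt sits at the root of your domain and might look like this:

```
# Applies to all crawlers
User-agent: *
# Keep private and duplicate-prone areas out of the index
Disallow: /admin/
Disallow: /search/
# Point crawlers at your XML sitemap
Sitemap: http://www.yourdomain.com/sitemap.xml
```

The Sitemap line is the reference Bing mentions in the next point.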
XML Sitemaps

“This is another file missing from a huge number of websites today. Another important file the search crawlers look for. One that is referenced inside the robots.txt mentioned above, and one which can help get more of your pages into the search index. Overall, it’s almost as important as the robots.txt file, and if you cannot place these files in a location the crawlers can find, you need to fix this issue.”
Beyond helping more of your site get indexed, it also helps control how often the search engines return to re-index your website – which, in the world of “real-time” search, is vital. Your sitemaps can also be submitted directly to the search engines.
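For reference, a minimal XML sitemap (the URL and date below are placeholders) follows the sitemaps.org protocol, including the optional <changefreq> hint that suggests how often crawlers should come back:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/</loc>
    <lastmod>2011-11-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

One <url> entry per page you want indexed; the <changefreq> and <priority> values are hints, not commands.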
Semantic Markup

“Marking up your content has been around for a few years now, and with the launch earlier this year of www.schema.org, the major engines have made a clear statement there is value in marking up your content. By embedding these elements in your page code, we can extract information more accurately and use that information to provide increasingly richer search results.”
While semantic markup still hasn’t become a mainstream practice, it’s definitely the future of web development. Implementing this markup will help your pages be indexed more accurately by the search engines.
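To give a flavour of it, here is a hypothetical product marked up with schema.org microdata; the itemprop attributes tell the engines exactly which bit of text is the name, which is the price, and so on:

```html
<!-- Hypothetical product snippet using schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Blue Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    Price: <span itemprop="price">$19.99</span>
  </div>
</div>
```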
Title, Description, Alt Tags, etc.
“Managing your title tags, meta descriptions, alt tags, etc. is still important. All these basic, on page/technical seo factors add up to help us understand what your content is relevant for. The bottom line with these items is you need to be able to manage them. If you cannot change these elements on a per-page basis, you lack needed control. “
Did someone say SEO 101? If you haven’t got control of these page elements, you are so far behind the game it’s not funny. This stuff is the absolute basics…
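For completeness, these are the elements in question; every value below is a placeholder, and the point is that you should be able to set each of them on a per-page basis:

```html
<head>
  <!-- Unique, descriptive title per page (placeholder text) -->
  <title>Blue Widgets | Acme Store</title>
  <!-- Meta description shown in search result snippets (placeholder text) -->
  <meta name="description" content="Hand-made blue widgets, shipped worldwide." />
</head>
<body>
  <!-- Alt text describes the image for crawlers and screen readers -->
  <img src="blue-widget.jpg" alt="Blue widget, front view" />
</body>
```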
Quality Content

“This might seem pretty obvious, but with so many websites still aggregating content or using article services to build out pages, it’s worth mentioning. We talked about how to build good content a little while back and the value of “article-site content”, but we still see websites trying to get ahead in the rankings by basing their websites on a thin content model.”
If you have a website where you can’t control content, let alone create new content – it’s time for change. Content is still king. And if you’re developing a new website, don’t even consider a developer that isn’t offering some level of content management – period!
Webmaster Tools Verification

“This is pretty straight forward. You need to be able to place the verification code in place to use webmaster tools. This could be in the form of embedding a tag in the <head> code of your webpage, or by a notation added in the DNS for the website.”
The webmaster tools services offered by the search engines are a key part of any website owner’s toolkit. If you can’t verify, you can’t enjoy the benefits.
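For example, both Google and Bing accept a verification meta tag in your page’s <head> (the content tokens below are placeholders for the ones each service issues you); DNS verification is the alternative Bing mentions above:

```html
<!-- Google Webmaster Tools verification (placeholder token) -->
<meta name="google-site-verification" content="YOUR-GOOGLE-TOKEN" />
<!-- Bing Webmaster Tools verification (placeholder token) -->
<meta name="msvalidate.01" content="YOUR-BING-TOKEN" />
```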
User Experience

“If you don’t have a website your visitors love, you’re missing an opportunity. Get cracking on a user experience review and see where you’re bleeding users. By staying tuned in to what users like and dislike about your website, you can make the myriad small changes needed to field a UX-winning site.”
Even though this is slightly more advanced… it’s vital intelligence. With the range of great tools that offer this info (think Google Analytics and Google Website Optimizer), website owners have no excuse for neglecting UX improvement.
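As an example, dropping Google’s standard asynchronous tracking snippet into your pages (UA-XXXXX-X is a placeholder for your own account ID) is all it takes to start collecting this visitor data:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder account ID
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```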
Social Sharing Integration
“This almost goes without saying, but we still see so many websites not involved socially with their visitors. Social isn’t going away folks, and while it does take work, skipping social integration is a missed opportunity. People share what they like with friends. If you have social sharing icons embedded in your pages, you are much more likely to get shared by visitors.”
Social is what takes your web presence to the next level. If you’re not making it easy for your visitors to share, refer and discuss your business, you’re giving your competitors a massive advantage. Social implementation is fairly simple these days…so embrace it!
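As a simple starting point (the titles, URLs and image below are placeholders), Open Graph meta tags control how your pages look when shared, and Twitter’s standard share-button embed adds a one-click sharing option:

```html
<!-- Open Graph tags: set the title/description/image used when shared -->
<meta property="og:title" content="Your Page Title" />
<meta property="og:description" content="A short summary of this page." />
<meta property="og:image" content="http://www.yourdomain.com/images/share.jpg" />

<!-- Twitter's standard share button embed -->
<a href="https://twitter.com/share" class="twitter-share-button">Tweet</a>
<script type="text/javascript" src="//platform.twitter.com/widgets.js"></script>
```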
So that’s the 9 essential optimization controls… If you want to read the added info that Bing provided, head over to their blog post.