Microsoft Rebuilds Bing Webmaster Tools
On July 21, 2010, Microsoft launched a revamp of its Bing Webmaster Tools.
Here is a snippet from the Bing Webmaster Tools announcement explaining what’s included in the new version:
The redesigned Bing Webmaster Tools provide you a simplified, more intuitive experience focused on three key areas: crawl, index and traffic. Brand new features, such as Index Explorer and Submit URLs, provide a more comprehensive view as well as better control over how Bing crawls and indexes your sites. Index Explorer gives you unprecedented access to browse through the Bing index in order to verify which of your directories and pages have been included. Submit URLs gives you the capability to signal which URLs Bing should add to the index. Other brand new features include: Crawl Issues to view details on redirects, malware, and exclusions encountered while crawling sites; and Block URLs to prevent specific URLs from appearing in Bing search engine results pages. In addition, the brand new tools take advantage of Microsoft Silverlight 4 to deliver rich charting functionality that’ll help you rapidly examine up to six months of crawling, indexing, and traffic data. That means more transparency and more control to help you make decisions, which optimize your sites for Bing.
Features of Bing Webmaster Tools:
It shows traffic data for the last six months: impressions, clicks, queries, and CTR.
Microsoft provides up to six months of data, but the worst part is that you can only view it on screen; you can’t download it. The data is there, but you can’t do any effective analysis with it.
It allows you to explore your site’s directories and pages in the Bing index.
The Index Explorer enables you to view which specific directories and pages of your site are indexed in Bing. Again, it can be useful to drill into this data, but it would be significantly more useful if it were downloadable. When you click on a URL, you see a pop-up with controls to block the cache, block the URL and cache, and recrawl the URL.
This feature allows you to submit up to 10 URLs per day and 50 URLs per month. Although an XML sitemap already lists all the URLs of a site you want indexed, this feature further helps a webmaster get the bot’s quick attention so that specific URLs are indexed faster.
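For context, a minimal XML sitemap following the sitemaps.org protocol looks like this (example.com and the paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled; example.com is a placeholder -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-07-21</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

You submit the sitemap once through the Crawl section of the tools, whereas the Submit URLs feature is better suited to pushing a handful of new or updated pages ahead of the regular crawl.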
It allows you to block any URL from Bing’s index. You can “Block URL and Cache”, block only the cache, and you can also unblock a URL later. It reminds me of Google’s URL removal feature.
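Note that blocking through the tool only affects Bing’s results; the standard site-level alternative is a robots.txt rule, which tells crawlers not to fetch the pages at all. A sketch, with placeholder paths (Bing’s crawler identified itself as MSNBot at the time):

```text
# Keep Bing's crawler out of a private directory (paths are examples)
User-agent: msnbot
Disallow: /private/

# Keep all crawlers away from a specific page
User-agent: *
Disallow: /drafts/old-page.html
```

The tool’s Block URLs feature is faster for removing an already-indexed page, while robots.txt prevents crawling in the first place.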
Overall, Microsoft did a good job, but there is still a lot of scope for improvement:
· There should be a backlinks report
· A robots.txt validator should be introduced
· Silverlight should not be required; it works well in IE but not in Chrome, which is seriously frustrating
· It should allow exporting data so that webmasters can review, analyze, and fine-tune their strategy