To prevent some indexing errors, Google is planning to launch a new crawler dedicated to smartphones. This change does not affect the other robots, such as the classic Googlebot or the crawlers dedicated to feature phones. The new Googlebot for smartphones user-agent will be:
Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
The Googlebot-Mobile for smartphones user-agent that Google will be retiring soon:
Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)
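As an illustrative sketch (the function name and return labels below are invented, not any official API), the retiring crawler can be told apart from its replacement by the Googlebot token in the user-agent string:

```python
def classify_google_crawler(user_agent):
    """Roughly classify a Google crawler by its user-agent string.

    Illustrative helper, not an official API: the retiring smartphone
    crawler announces itself as "Googlebot-Mobile", while the new one
    uses the regular "Googlebot" token inside a smartphone user-agent.
    """
    if "Googlebot-Mobile/" in user_agent:
        return "googlebot-mobile (retiring)"
    if "Googlebot/" in user_agent:
        # The new smartphone crawler pairs the Googlebot token with
        # smartphone markers such as "iPhone" and "Mobile".
        if "iPhone" in user_agent or "Mobile" in user_agent:
            return "googlebot-smartphone (new)"
        return "googlebot-desktop"
    return "other"
```

Note that user-agent strings can be spoofed, so any serious log analysis should verify the crawler by a reverse-DNS lookup of its IP address rather than by string matching alone.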
The new Googlebot for smartphones crawler will follow robots.txt, robots meta tag, and HTTP header directives. Google estimates that the update will affect less than 0.01% of URLs, and it will give webmasters more control over the crawling and indexing of their content.
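For illustration, the three kinds of directives mentioned above look roughly like this (the path and rules are made-up examples, not recommendations):

```
# robots.txt — crawl rule applying to Googlebot (made-up path)
User-agent: Googlebot
Disallow: /private-example/
```

The per-page equivalents are the robots meta tag and the X-Robots-Tag HTTP response header:

```
<meta name="robots" content="noindex, nofollow">
```

```
X-Robots-Tag: noindex, nofollow
```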
Twitter announced a new feature that it calls “broad match for keyword targeting,” which allows advertisers running keyword-targeted campaigns to reach users who are using synonyms, alternate spellings, or “Twitter Lingo.”
For instance, if a coffee shop sells lattes but not espressos, it can use the “+” modifier on broad-matched terms to prevent over-broadening and targeting the wrong users. Targeting “love +latte” will match users who tweet “luv latte,” but not those who tweet “luv espresso.”
Google has provided a few suggestions in a new webmaster help video on how to recover from a Google penalty caused by spammy links.
Matt Cutts answered the question:
“How did Interflora turn around their ban in 11 days? Can you explain what kind of penalty they had, and how they fixed it, as some of us have spent months trying to clean things up after an unclear GWT notification?”
Matt Cutts posted a video explaining why Google no longer has the 100-links-per-page webmaster guideline.
Cutts says, “It used to be the case that Googlebot and our indexing system would truncate at 100 or 101K, and anything beyond that wouldn’t get indexed, and what we did, was we said, ‘Okay, if the page is 101K, 100K, then, you know, it’s reasonable to expect roughly one link per kilobyte, and therefore, something like 100 links on a page.’ So that was in our technical guidelines, and we said, you know, ‘This is what we recommend,’ and a lot of people assumed that if they had 102 links or something like that then we would view it as spam, and take action, but that was just kind of a rough guideline.”
Matt Cutts, head of Google’s search spam team, has posted a new video discussing the Disavow Links tool. Should webmasters use the disavow tool even if they believe no penalty has been applied?
Cutts said the main purpose of the tool is for when you have done some “bad SEO” yourself, or someone has done it on your behalf.
The new Event Listener Tag can be used to tell Google Tag Manager when you want to listen for events, and then to write detailed rules for what to do when an event happens. This means you can have tags fire based on form submits, clicks, and timers, using a rule that looks for the corresponding event. You can make sure you’re targeting the right form by using Auto-Event Variable macros to narrow down your rules via attributes such as the element ID and the form target.
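As a hypothetical sketch of such a rule (the tag and form names are invented, and the gtm.formSubmit event name is an assumption about what the form-submit listener pushes into the data layer), a form-submit rule might combine the event with an element-ID condition:

```
Fire tag "Signup Conversion" when:
  {{event}} equals gtm.formSubmit
  AND {{element id}} equals "signup-form"
```

The element-ID condition is what narrows the rule from “any form on the page” down to the one form you actually want to track.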
Facebook has announced new mobile app ad options for developers to help drive up engagement. The ads let developers make use of seven specific call-to-action choices to include in their apps.
In the first phase of its mobile app ads, Facebook offered one call to action: “Install Now”. Now the company is moving into the second phase and has added seven more: “Open Link”, “Use App”, “Shop Now”, “Play Game”, “Book Now”, “Listen Now” and “Watch Video”, all designed to increase engagement with existing apps.
Can nofollow links hurt my site?
Matt Cutts released a new video on the Google Webmaster Help YouTube channel.
Cutts said that nofollow links won’t harm your rankings in Google, but if a webmaster is spamming them on a massive scale, it could get the attention of the search engine spam team and trigger a manual review.
Microsoft and Local Corporation have announced an agreement to provide location-based inventory data for Bing.
As part of the agreement, Local Corporation’s Krillion shopping data platform is helping power Bing’s local product search results, including relevant retail locations, brands, categories, and product availability details.
Matt Cutts asked last night on Twitter for webmasters and SEOs to fill out a survey on the subject of small sites and Google rankings.
Cutts asked: “If there’s a website that you think should be doing better in Google, tell us more.” The form makes it clear that filling it out will not impact the ranking of the submitted site.