Build an Online Portfolio and Drive Traffic to it

Getting the search engine optimization (SEO) results you expect can often be difficult. Even if you have tried and tested several SEO techniques, one trick that can get you better results is building a great online portfolio.

Of late, the online portfolio, a form of digital self-promotion, has emerged as an effective SEO tool. So how exactly do you build an engaging online portfolio and drive traffic to it? Here are some easy steps to build an online portfolio that will bring you better traffic as well as better sales:


1. Conduct Thorough Research

Before you start developing an online portfolio, it helps to conduct intelligent and thorough research on the areas that require improvement. For example, determine the right keywords that will drive more traffic to your website.

2. Stay Simple and Realistic

Simplicity is the key to good SEO results, but do not confuse simplicity with being insubstantial. Once you have decided to build an online portfolio, incorporate elements in such a way that the end user can understand them without any hassle. It is also important to take a realistic approach to building the portfolio. Whether you are adding specific elements to your portfolio design or choosing the right keywords, always stay realistic, even if the initial results are not as great as you expected.

3. Include Titles and Descriptions

If you want search engines to crawl your online portfolio better, include titles and descriptions with your work. For example, add the client name and the type of work to the title tag of each portfolio page, along with keyword-rich descriptions. This way you not only promote your work well but also earn better SEO rankings.
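As an illustration, the head section of a portfolio page might look like the following. The client name and project description here are hypothetical; substitute your own work.

```html
<!-- Hypothetical portfolio entry: client name and type of work in the
     title, plus a keyword-rich description for search engines -->
<head>
  <title>Acme Corp | Responsive E-commerce Website Design</title>
  <meta name="description"
        content="Responsive e-commerce website designed for Acme Corp,
                 featuring a mobile-first layout and custom product pages.">
</head>
```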

4. Start a Blog and Build Strong Inbound Links

Content marketing can act as a great catalyst for your online portfolio, and one of the best ways to do it is through regular blog posts. Make sure your posts cover topics that are informative and useful to the end user.

Along with this, building inbound links will go a long way toward making your portfolio popular. Always check the latest Google guidelines when building inbound links.

5. Use Valid Code and Make Your Work Shareable

One of the best ways to drive better traffic to your online portfolio is to use valid code. You can find all the relevant information at the W3C (World Wide Web Consortium) website.

Most important of all, make your portfolio shareable on the various social media platforms. That way, visitors can share it with their groups and communities and help your work go viral.

These tips will go a long way toward helping you build a great portfolio. You can also hire professional SEO services to get a dynamic online portfolio. A good portfolio will surely boost your SEO results in a cost-effective manner.


Twitter Launches Broad Match For Keywords

Twitter announced a new feature it calls “broad match for keyword targeting,” which allows advertisers running keyword-targeted campaigns to reach users who use synonyms, alternate spellings, or “Twitter lingo.”

For instance, if a coffee shop sells lattes but not espressos, it can use the “+” modifier on broad-matched terms to prevent broadening and targeting the wrong users. Targeting “love +latte” will match users who tweet “luv latte,” but not those who tweet “luv espresso.”
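To make the behavior concrete, here is a toy sketch of the matching logic in Python. Twitter's real matching rules are not public, so the variant map and logic below are purely illustrative.

```python
# Toy illustration of broad vs. "+" (exact) keyword matching.
# Twitter's actual matching logic is not public; the variant map
# below is a stand-in for synonyms, alternate spellings, and slang.
VARIANTS = {"luv": "love", "lurve": "love"}

def normalize(word):
    """Map a variant spelling back to its canonical keyword."""
    return VARIANTS.get(word, word)

def matches(tweet, keywords):
    """Return True if every targeted keyword matches the tweet.
    A '+' prefix demands the literal word; bare keywords also
    accept variant spellings."""
    words = tweet.lower().split()
    for kw in keywords:
        if kw.startswith("+"):
            if kw[1:] not in words:                    # exact word required
                return False
        elif kw not in {normalize(w) for w in words}:  # variants allowed
            return False
    return True

print(matches("luv latte", ["love", "+latte"]))     # True
print(matches("luv espresso", ["love", "+latte"]))  # False
```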


Twitter says broad match is now available through its ads interface and its advertiser API. Furthermore, broad match will be the default matching type for targeted keywords going forward. Existing campaigns will behave as before: their keywords will be automatically opted into the “+” modifier to prevent broadening.

None of this is new, yet on social networks like Twitter such features are of the utmost importance, as users interact with the service in unique ways. After all, they only have 140 characters to work with.

How to Recover From a Google Penalty Incurred as a Result of Spammy Links

In a new webmaster help video, Google has provided a few suggestions on how to recover from a Google penalty caused by spammy links.

Matt Cutts answered the question:

“How did Interflora turn their ban in 11 days? Can you explain what kind of penalty they had, how they fixed it, as some of us have spent months trying to clean things up after an unclear GWT notification?”

Interflora is a major UK flower site that was hit by a Google penalty early this year, though Google didn’t call out the company publicly. After reports of the penalty came out, the company wrote a blog post warning people not to engage in the “buying and selling of links.”

Matt Cutts proceeds to answer the question in more general terms: “Google tends to look at buying and selling links that pass PageRank as a violation of our guidelines, and if we see that happening multiple times, the action that we take gets more and more severe. We’re more willing to take stronger action whenever we see repeat violations.”

Cutts says, “It’s not something that I would typically recommend for everybody, to disavow every link that you’ve gotten for a period of years, but certainly when people start over with completely new websites they have bought, we have seen a few cases where people will disavow every single link because they truly want to get a fresh start.”

In other words, if you’re willing to go to such great lengths and eliminate such a big number of links, Google is going to notice.

Continuing, Cutts said: “If you’ve got links from some very spammy forum or something like that, rather than trying to identify the individual pages, that might be the opportunity to do a ‘domain:’. So if you’ve got a lot of links that you think are bad from a particular site, just go ahead and do ‘domain:’ and the name of that domain. Don’t try to pick the individual links, because you might be missing a lot.”
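Put into practice, that advice translates into a disavow file like the one below (the domain here is made up for illustration). Lines beginning with "#" are comments, and a "domain:" line covers every link from that site:

```
# A couple of individual URLs identified as bad
http://spammy-forum.example/thread?id=123
# Disavow everything from the domain rather than picking single links
domain:spammy-forum.example
```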

How Many Links Should You Have on a Page? Matt Cutts Explains

Matt Cutts posted a video explaining why Google no longer has its 100-links-per-page webmaster guideline.

Cutts says “It used to be the case that Google bot and our indexing system would truncate at 100 or 101K, and anything beyond that wouldn’t get indexed, and what we did, was we said, ‘Okay, if the page is 101K, 100K, then, you know, it’s reasonable to expect roughly one link per kilobyte, and therefore, something like 100 links on a page.’ So that was in our technical guidelines, and we said, you know, ‘This is what we recommend,’ and a lot of people assumed that if they had 102 links or something like that then we would view it as spam, and take action, but that was just kind of a rough guideline”

Matt also explained that your PageRank is divided by the number of links on a page. If you have 100 links, you’ll divide your PageRank by 100. If you have 1,000 links, you’ll divide your PageRank by 1,000.
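As a rough back-of-the-envelope sketch of that idea (real PageRank flow involves damping factors and iteration, all of which this ignores):

```python
# Rough sketch: the PageRank a page can pass is split among its
# outbound links, so each link's share shrinks as the link count grows.
# Real PageRank involves damping and iteration; this ignores all that.
def share_per_link(page_rank, num_links):
    return page_rank / num_links

print(share_per_link(10.0, 100))    # 0.1
print(share_per_link(10.0, 1000))   # 0.01
```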

If you’re concerned about having too many links on a page, Cutts suggests getting a “regular user” to test it out and see whether they think it has too many links.

Matt Cutts Talks Negative SEO and the Disavow Links Tool

Matt Cutts, head of Google’s search spam team, has posted a new video discussing the Disavow Links tool. Should webmasters use the disavow tool even if they believe no penalty has been applied?

Cutts said the main purpose of the tool is for when you have done some “bad SEO” yourself, or someone has done it on your behalf.

If you have done the work to keep an active eye on your backlinks and you see something strange going on, you don’t have to wait around. Feel free to preemptively say, “This is a weird domain; I have nothing to do with it, and I don’t know what this particular bot is doing in terms of making links.” Just go ahead and disavow, even at the domain level.

Auto-Event Tracking with Google Tag Manager: No More JavaScript and HTML Code

Google announced a new version of Google Tag Manager with Auto-Event Tracking. You can now write rules instead of JavaScript and HTML.

Auto Event Tracking lets you track almost any user action without any additional JavaScript. It automatically captures user actions like clicks and form submissions.

The new Event Listener Tag can be used to tell Google Tag Manager when you want to listen for events, after which you write detailed rules for what to do when an event happens. This means you can have tags fire based on form submits, clicks, and timers, using a rule that looks for the corresponding event. You can make sure you’re targeting the right form by using auto-event variable macros to narrow down your requests via attributes like the element ID and the form target.


There are four different types of user actions that these listener tags can detect. Again, each action results in a Google Tag Manager event.

  • Click Listener: This tag listens for any click on a page, including button clicks, link clicks, and image clicks. When a click occurs, the Google Tag Manager event gtm.click is automatically generated.
  • Form Listener: This tag listens for form submissions. When a form submission occurs, the Google Tag Manager event gtm.formSubmit is automatically generated.
  • Link Click Listener: Same as the Click Listener, except it only captures clicks on links. When a link is clicked, the Google Tag Manager event gtm.linkClick is automatically generated.
  • Timer Listener: This tag collects data at a regular interval that you specify. For example, if you specify an interval of 10,000 milliseconds, Google Tag Manager will fire an event every 10 seconds.

If you want to automatically listen for user actions, you must include one of the above tags on the page where you would like to capture the action.
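Conceptually, each listener tag pushes an event object onto a shared data layer, and your rules then match on the event name to fire tags. The small Python sketch below mimics that flow; the field names other than the documented event names are illustrative.

```python
# Minimal sketch of how GTM's auto-event listeners communicate: each
# listener pushes an event dict onto a shared data layer, and rules
# fire tags by matching on the "event" key. Field names other than the
# event names are illustrative, not GTM's actual schema.
data_layer = []

def fire(event_name, **fields):
    """Simulate a listener tag generating a Google Tag Manager event."""
    data_layer.append({"event": event_name, **fields})

fire("gtm.formSubmit", element_id="contact-form")  # Form Listener
fire("gtm.timer", interval_ms=10000)               # Timer Listener

print([e["event"] for e in data_layer])  # ['gtm.formSubmit', 'gtm.timer']
```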

To conclude: is it worth using Google Tag Manager? Yes, of course, though it is a bit complicated because it is a general tool that has to work with any tracking code. Google Analytics is very important and helpful for search engine optimization (SEO); it helps you understand the user’s experience after they visit your site.


Facebook Tweaks Mobile App Ads

Facebook has announced new mobile app ad options for developers to help drive engagement. The ads let developers choose from seven specific calls to action to include in their ads.

In its first phase of mobile app ads, Facebook offered one call to action: “Install Now”. Now the company is moving into the second phase and has added seven more: “Open Link”, “Use App”, “Shop Now”, “Play Game”, “Book Now”, “Listen Now” and “Watch Video”, which are all designed to increase engagement with existing apps.


Facebook launched its mobile app ads last September, and they have driven more than 145 million installs from Apple’s App Store and Google Play, according to Facebook. The change in calls to action was partially influenced by a study conducted by Localytics, which found that 66% of app users open apps only between one and ten times.

The idea of re-engaging users is a familiar one in online marketing. On Facebook, it can take the form of targeting users who have already installed a mobile app and presenting them with custom calls to action. A Facebook spokesperson noted that the new calls to action should make the ads more tailorable to apps beyond gaming, and said this won’t change the way the company charges for the ads.

For Facebook itself, mobile app install ads have helped drive its rapid rise in mobile ad sales, which accounted for 41% of total ad revenue in the second quarter. While not providing specific numbers, Facebook CEO Mark Zuckerberg said in the July earnings call that revenue from the app ads continued to accelerate.


Matt Cutts: Reasonable Nofollow Links Won’t Damage Your Search Rankings in Google

Can nofollow links hurt my site?

Matt Cutts released a new video on the Google Webmaster Help YouTube channel.

Cutts said that nofollow links won’t harm your rankings in Google, but if a webmaster is spamming them on a massive scale, it could get the attention of the search spam team and trigger a manual review.

Things like blog comments tend to have a very particular footprint that is easy to spot. Cutts mentioned a specific example where blog comments were problematic, even though they were already nofollowed, because they were posted on such a massive scale.

SEO experts have known for quite some time that a nofollow link is a great way to tell Google that you don’t want to pass PageRank through a particular link, or that you don’t fully trust the site the link points to. In addition, giving nofollow links any negative attention at all is only going to encourage people who launch negative SEO attacks on their competitors.
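For reference, a nofollow link is just a normal anchor tag with a rel="nofollow" attribute (the URL here is a placeholder):

```html
<!-- Tells search engines not to pass PageRank through this link -->
<a href="http://example.com/some-page" rel="nofollow">Example link</a>
```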

Bing’s Local Product Inventory Data in Search Results

Microsoft and Local Corporation have announced an agreement to provide location-based inventory data for Bing.

As part of the agreement, Local Corporation’s Krillion shopping data platform is helping power Bing’s local product search results, including relevant retail locations, brands, categories, and product availability data and details.


According to a study conducted by Local Corporation and the e-tailing group, consumers are increasingly relying on non-store channels such as search to research products prior to purchase. The study also reports that 90 percent of shopping still involves a trip to the store, and that almost half of consumers spend 50 percent or more of their shopping time researching products online.

In August, Bing announced changes to its product search feature, which now indexes tens of millions of individual products. The feature uses advanced machine learning and intent signals to serve product pricing and availability directly from the merchant in search results. New product ads with photos and pricing make it possible to quickly see offers from merchants across the web. The feature replaces Bing Shopping.

It’s not clear what percentage of product queries are inventory related; however, most products are still purchased in local stores.


Matt Cutts Wants to Know: Why Is Your Small Website Not Ranking Well in Google?

Last night on Twitter, Matt Cutts asked webmasters and SEOs to fill out a survey on the subject of small sites and Google rankings.


Matt Cutts asked: if there’s a small website that you think should be doing better in Google, tell us more. The form makes it clear that filling it out will not affect the ranking of the submitted site.

The form collects two pieces of information:

  • The name and URL of the small site you think should rank well.
  • Why you think that small site should rank better.

So if you have a small site and think it should rank better, make sure to tell Google about it.