Saturday 30 March 2013

[Build Backlinks Online] The Ultimate Guide to Setting up Google Experiments for Your Blog

Build Backlinks Online has posted a new item, 'The Ultimate Guide to Setting up
Google Experiments for Your Blog'

Ever want to try improving the conversion rate of your blog, but didn't have the
money to hire a designer, the time to figure it all out, or the technical
prowess to pull it off? Well, I'm happy to say those excuses are no longer valid
now that the Google Experiments feature has been integrated into Google
Analytics. If you're not familiar with Google Experiments already, it's a
feature that lets you serve different variations of your blog to different
website visitors and measure the performance of those variations against your
goals (more email opt-ins, lower bounce rates) to see which version performs
best. Let me explain in simpler terms with a hypothetical example of how to set
up Google Experiments.

You may view the latest post at
http://feedproxy.google.com/~r/BasicBlogTips/~3/6HUBofEyqrs/google-experiments-guide.html

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] Another March Mozscape Index is Live!

Build Backlinks Online has posted a new item, 'Another March Mozscape Index is
Live!'


Posted by carinoverturf

We're happy to announce that the second Mozscape index for the month of March is
now live! Data has been refreshed across all SEOmoz applications - Open Site
Explorer, the Mozbar, PRO campaigns, and the Mozscape API.

I know what you're all thinking - but wait, didn't you just launch an index last
week?! This was one of our fastest indexes to finish processing - taking only 11
days - so processing overlapped slightly between this index and our previous
March index. The good news is that this just means lots of fresh data for you!

As you can see from the crawl histogram, a large volume of the data in this
index was crawled at the beginning of March, with the oldest data dating back to
about the first week of February.



Here are the metrics for this latest index:


81,359,307,805 (81 billion) URLs

12,256,956,717 (12.2 billion) Subdomains

149,419,721 (149 million) Root Domains

774,927,201,776 (775 billion) Links

Followed vs. Nofollowed


2.18% of all links found were nofollowed

54.51% of nofollowed links are internal

45.49% are external



Rel Canonical - 15.80% of all pages now employ a rel=canonical tag

The average page has 75 links on it


64.04 internal links on average

11.03 external links on average




And the following correlations with Google's US search results:


Page Authority - 0.35

Domain Authority - 0.19

MozRank - 0.24

Linking Root Domains - 0.30

Total Links - 0.25

External Links - 0.29


We always love to hear your thoughts! And remember, if you're ever curious
about when Mozscape next updates, you can check the calendar here. We also
maintain a list of previous index updates with metrics here.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten
hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think
of it as your exclusive digest of stuff you don't have time to hunt down but
want to read!






You may view the latest post at
http://feedproxy.google.com/~r/seomoz/~3/gu-leR31y80/another-march-mozscape-index-is-live

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] How to Know if You’re Spread Too Thin in Social Media

Build Backlinks Online has posted a new item, 'How to Know if You're Spread
Too Thin in Social Media'

Amazing how far the pendulum has swung for corporate social media. In just three
years, we've moved from skepticism and suspicion of social media as an activity
worthy of support to the present scenario of rampant social participation
proliferation. The most endangered word in social media today is "no," as social
media managers and governance [...] How to Know if You're Spread Too Thin in
Social Media is a post from: Convince and Convert: Social Media Strategy and
Content Marketing Strategy


You may view the latest post at
http://www.convinceandconvert.com/social-media-strategy/how-to-know-if-youre-spread-too-thin-in-social-media/?utm_source=rss&utm_medium=rss&utm_campaign=how-to-know-if-youre-spread-too-thin-in-social-media

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] Step-by-Step Guide to Qualifying Your Backlink Sources

Build Backlinks Online has posted a new item, 'Step-by-Step Guide to Qualifying
Your Backlink Sources'


As a general rule, a do-follow link is better than a no-follow link, as the
former passes on search value, while the latter does not. Keep in mind, though,
that link equity isn't the only reason to build links. Say, for example, that
you were able to ... See all stories on this topic


Search Engine Journal



You may view the latest post at
http://www.google.com/url?sa=X&q=http://www.searchenginejournal.com/step-by-step-guide-to-qualifying-your-backlink-sources/59715/&ct=ga&cad=CAcQARgAIAAoATAAOABAkuzYigVIAlAAWABiBWVuLUdC&cd=bHLw3dSPFXw&usg=AFQjCNFdNVs06xyWdXFZUj5O1mQJmsUOqQ

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] 5 Strategies for Better 'Link Building' and Improving Your SEO

Build Backlinks Online has posted a new item, '5 Strategies for Better 'Link
Building' and Improving Your SEO'

Both have changed online marketing best practices, including everything from how
sites should be built to how "backlinks" should be created. Links that go from
another site to yours are called backlinks because they point back to your
pages. Building ... See all stories on this topic


You may view the latest post at
http://www.google.com/url?sa=X&q=http://news.terra.com/5-strategies-for-better-link-building-and-improving-your-seo,9b2cf5ac7bb2d310VgnCLD2000000ec6eb0aRCRD.html&ct=ga&cad=CAcQARgAIAAoATAAOABAkuzYigVIAlAAWABiBWVuLUdC&cd=bHLw3dSPFXw&usg=AFQjCNFO0V9npYunzVZdfAqZQkbZ3R1Hmw

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

Friday 29 March 2013

[Build Backlinks Online] Solving Klout’s “Warren Buffett Problem”

Build Backlinks Online has posted a new item, 'Solving Klout's "Warren
Buffett Problem"'

Matt Thomson, VP of Business Development at Klout, joins the Social Pros
Podcast this week to discuss the Warren Buffett problem his company faces, those
fancy analytics at work behind the Klout algorithms, and expanding to offer
brands a whole new product. Read on for some of the highlights and tweetable
moments, or listen to the full podcast. [...] Solving Klout's "Warren Buffett
Problem" is a post from: Convince and Convert: Social Media Strategy and Content
Marketing Strategy


You may view the latest post at
http://www.convinceandconvert.com/social-pros-podcast/solving-klouts-warren-buffett-problem/?utm_source=rss&utm_medium=rss&utm_campaign=solving-klouts-warren-buffett-problem

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] The Mystery of Quality Score

Build Backlinks Online has posted a new item, 'The Mystery of Quality Score'

[Image: Mystery of Quality Score]

Quality score can be confusing - there's no doubt about it. Many different factors can affect it, and it can be difficult to determine how to influence it. It's also in a state of constant evolution, so every time you think you have it down pat, think again. There's no need to pull your hair out, though - I am going to show you what you need to focus on to start seeing more 10s in your account!

(More: Revisiting the Economics of Quality Score: Why QS Is Up to 200% More Valuable in 2013)

Google defines its Quality Score using the following components:

  • Past click-through rate (CTR) of a keyword
  • Past CTR of a display URL
  • Account history (overall CTR for ads and keywords)
  • Landing page quality
  • Relevance of keywords to ads
  • Relevance of keywords to search queries
  • Geographic performance
  • Performance of an ad on a particular website (this only applies to the Display Network)
  • Performance of an ad based on targeted device

Quality score can affect a number of elements of your account, including:

  • Ad auction eligibility
  • Actual cost per click (CPC) for a keyword
  • First page bid estimate for a keyword
  • Top of page bid estimate for a keyword
  • Ad position

Quality score is calculated every single time an ad is eligible to appear. This could mean thousands upon thousands of quality score calculations per day! Keep in mind that, when you check your campaign performance, the quality score you see for a campaign, ad group, keyword, or ad is actually an overall average. Also, the average quality score you see is only updated every 4-6 weeks or so. If you make changes that strengthen your account, Google is going to wait a while before rewarding you with a better quality score. Think of it as a "proof of concept" model.

While Google has not specifically defined how each of these components is weighted in determining quality score, there are ways of gauging how great an impact they can have. In my opinion, relevance is the absolute most important factor in quality score. Relevance is the key strategic point for any paid search campaign, and quality score takes it heavily into account.

You need to ensure your keywords are relevant to your ad text, your ad text is relevant to the content of your landing page, and your keywords are relevant to what a user is searching for. You also want to make sure the keywords are highly relevant to the theme of your ad groups (noticing a pattern yet?) and that the theme of your ad groups is relevant to the theme of your campaigns. There is a method to this madness though. Google puts such an emphasis on relevance because it wants to be known as a well-respected, trustworthy, reliable source for finding information. Google wants you to be able to find exactly what you are looking for, at the exact moment you are looking for it.

Click-through rate is a really good indicator of relevance. If your ad is relevant to a searcher's query, they are more likely to click on your ad, visit your website, and potentially convert.

Consider these examples: which of these ads would you be more likely to click on if you wanted to buy a Boston Bruins player jersey?

[Image: Quality Score Ads]

It's pretty clear which of these ads is more relevant and compelling to a user searching for custom-made Boston Bruins jerseys. The ad on the left is extremely keyword-rich, with quality keywords in the headline, text lines, and even the display URL. With the ad on the right, a user would be uncertain what type of Boston Bruins apparel is being sold, and the ad doesn't engage the user with a strong call-to-action like the ad on the left does.

You want to be as specific and relevant as you possibly can, and Google will reward you for it. Remember, the higher the quality score, the lower your CPC, first page bid estimate, and top of page bid estimate will be. We all want to have our ad appear in those precious, premium positions, and we want to pay as little as possible for them. Your ad rank (what position your ad appears in) is determined by two things: your max CPC and your quality score. So while your competition may be bidding more than you, your ad can potentially appear in a better position thanks to a strong quality score. You'll pay less, and your ad will appear in a better position - how can you go wrong?
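
To make the mechanics concrete, here is a minimal Python sketch of the simplified auction model described above (ad rank = max CPC bid x quality score, and you pay just enough to beat the ad rank of the advertiser below you, plus a penny). The advertisers and numbers are invented, and Google's real auction involves more signals than this:

# Simplified model only: ad rank = bid * QS; actual CPC is
# (ad rank of the advertiser below you / your QS) + $0.01.
def auction(advertisers):
    ranked = sorted(advertisers, key=lambda a: a["bid"] * a["qs"], reverse=True)
    results = []
    for i, ad in enumerate(ranked):
        if i + 1 < len(ranked):
            below = ranked[i + 1]
            cpc = (below["bid"] * below["qs"]) / ad["qs"] + 0.01
        else:
            cpc = 0.01  # last position: simplified reserve price
        results.append((ad["name"], round(cpc, 2)))
    return results

print(auction([
    {"name": "high QS, lower bid", "bid": 2.00, "qs": 10},
    {"name": "low QS, higher bid", "bid": 3.00, "qs": 4},
]))
# The high-QS advertiser outranks the higher bidder (ad rank 20 vs. 12)
# and pays only 12 / 10 + 0.01 = $1.21 per click.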

This post originated on the WordStream Blog. WordStream provides keyword tools for pay-per-click (PPC) and search engine optimization (SEO), aiding in everything from keyword discovery to keyword grouping and organization.

You may view the latest post at
http://feedproxy.google.com/~r/WordStreamBlog/~3/S-jucxF68zc/mystery-of-quality-score

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

Thursday 28 March 2013

[Build Backlinks Online] How to Live Tweet Like a Pro

Build Backlinks Online has posted a new item, 'How to Live Tweet Like a Pro'


Posted by RuthBurr

Those of you who follow me on Twitter have probably noticed that I live-tweet
the conferences I go to. Extensively. Some people love it, some people hate it -
but if you want to start live-tweeting for yourself, here are some things to
keep in mind.

Why I Live Tweet

I started live tweeting events a couple of years ago, when I realized that I
was spending as much time and effort tweeting out the most relevant points of
the session I was in as I spent taking notes - plus, the notes I took were less
relevant than my tweets, since I was only tweeting out the best parts!

Once I committed to live tweeting conferences, I got a lot of great, positive
feedback about it from other attendees, so I kept on going. I've also gotten the
bulk of my followers through live tweeting; it can be a great way to build your
personal brand at conferences and get increased visibility with attendees and
speakers alike. Live tweeting doesn't just build your brand among attendees of
the conference, either. People who are trying to follow along at home via the
conference hash tag are often even bigger fans of quality live tweets.

There's a noticeable uptick in people who read my name badge and say "oh, you're
Ruth Burr!" at the end of a conference compared to the beginning (when they
usually just say "nice to meet you").


@ruthburr Cheers for all the tweets, they are better than my notes, and much
neater ;)
Kingsland Linassi (@kingslandlinass) March 15, 2013



A big thanks for @ruthburr for live tweeting at #LinkLove. Appreciate it! :)
Dennis Seymour (@denseymour) March 15, 2013


So that's nice.

Why You Might Not Want to Live Tweet

A few caveats before we get into the nitty-gritty of quality live Twitter
coverage:

You will lose followers. When I'm covering a conference, I'm tweeting multiple
times per minute, all day. That can really blow up someone's Twitter feed. I
usually encourage my followers to mute me or the conference hash tag if they
don't want to be inundated, but some people just choose to unfollow, and some of
those people don't re-follow after the conference is over.

Here are my daily follow/unfollow numbers from the last 60 days, courtesy of
Followerwonk:



As you can see, I get the most new followers on days I'm live tweeting, but I
get the most unfollows on those days as well. With the 31 followers I lost
during SearchFest, my 54 new followers start to look a lot more like 23. I'm
still at a net gain of followers, but if you're not prepared to (permanently)
lose some followers (especially those who aren't in the search industry), live
tweeting may not be for you.

It takes a ton of energy. Conferences can already be really draining, between
the late nights, the "always on" networking conversations, and the stress of
trying to still get some work done while you're there. Live tweeting takes a
surprising amount of energy: the bulk of your focus needs to be on the session,
not on the session + your work email + your slides for later in the day +
Facebook. Tweeting live also means that even if a session is really boring or
not at all useful to you, you can't take a nice relaxing mental break and zone
out or work on something more important.

You're reporting the news, not making it. That's something that can get lost in
translation through retweets and replies. You're going to get clarifying
questions and dissenting opinions about things you didn't even say (or
necessarily agree with). No matter how many times you say "I didn't say it,
Duane Forrester did. I'd suggest asking him if you need more information," some
people are still going to get hung up on the idea that you're the one advocating
a particular position. It can get sticky.

You'll probably get rate limited. I usually end up unable to tweet for at least
an hour per conference, because the Twitter API has blocked me for tweeting too
many times in too short a period.

So! Caveats firmly in place, let's talk about:

How to Provide Value via Live Tweets


Provide as much context as you can. Take this tweet from SearchFest:

Agility: Kinect was for games 1st, ppl hacked it, MSFT provided an SDK for
ppl to build what they want @melcarson #searchfest
Ruth Burr (@ruthburr) February 22, 2013


Just adding the word "Agility" to the beginning of the tweet puts the entire
factoid into the context in which Mel was using it. This increases the ability
of the tweet to be read and understood outside of the context of other
conference tweets. Which brings me to:

Think about the retweet. Each piece of information you tweet needs to be able
to survive on its own, independent of the tweets that preceded or followed it.
When you get retweeted, the new audience viewing that tweet may not have seen
your other tweets on the topic: make sure that tweet will make sense to them,
too.

Numbers are gold. When someone cites a statistic in their talk, tweeting the
specific numbers they mention really increases the relevance of your tweet.

Sites that regularly post content w/video have 200-300% more new visitors
and 2x time on page - key signs of relevance @thetoddhartley #SMX
Ruth Burr (@ruthburr) March 12, 2013



Don't try to live tweet anecdotes. Speakers will often use illustrative
examples in their talks, whether they're passing anecdotes or full-on case
studies. These can be extremely hard to live tweet. Remember to stick to the
rules above. It's OK to sum up a two-minute anecdote or case study in one or
two tweets that are focused on the point.

Capture as many URLs as you can. If someone includes a link on a slide, I'll
usually type that out first and then write the tweet context around it, in case
they change the slide before I can write it down (this is especially important
with bit.ly links). Want to go above and beyond? If someone mentions a great
article but doesn't include the link, Google the piece and provide the link
yourself. That way you're adding extra value with your tweets.

Give shout-outs. Any time someone mentions a tool, tweet that out. If you know
that company's Twitter handle, include them with an @ mention. Do the same for
people. People love hearing about new tools to use, and businesses and
individuals alike love hearing they got a shout-out in a presentation. Doing
this also gets you on the radar of people who might not even be following the
conference.

Watch the conference hash tag. In addition to tweeting out the session you're
attending, keep an eye on the tweets coming out of other sessions. When you see
a juicy, highly retweetable tweet come out, retweet it! Now you're providing
information on other sessions, too. Speaking of which:

Use the conference hash tag and speaker handles. I usually end each conference
tweet with the speaker's Twitter handle and the conference hash tag. It helps
mitigate the "I don't make the news, I just report it" factor I mentioned
earlier, plus it's important to give credit where credit's due. Most of the time
I'll just copy the speaker handle and hash tag from my first tweet and then
paste them at the end of each tweet (be careful there aren't any typos when you
copy, though - I spent half of Marty Weintraub's MozCon session accidentally
tweeting him as @aimcear instead of @aimclear).


One tool I'll often use for live-tweeting conferences is TweetChat. It allows
you to track just the tweets coming from one hash tag, and it will automatically
add the tag to the end of every tweet you post from the tool.

Other than that, I don't use many tools for live tweeting - I'm usually just
using the Twitter app for Mac. I use keyboard shortcuts for "new tweet" and
"post tweet" to save a bit of time.

The last thing you'll really need to be able to live tweet a full conference is
the ability to type very fast, with few mistakes, and without looking at your
hands or, necessarily, the screen. I don't have any good recommendations for
tools/programs to use to learn to type faster; I learned to type really fast by
getting in a lot of arguments with people over instant messenger in high school
and college, so you could try that. If anybody has any suggestions for programs
to hone your typing skills, I'd love to see them in the comments!

Happy live tweeting everybody!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten
hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think
of it as your exclusive digest of stuff you don't have time to hunt down but
want to read!






You may view the latest post at
http://feedproxy.google.com/~r/seomoz/~3/0sd5uOQY5YQ/how-to-live-tweet-like-a-pro

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] Real Talk About Why So Many Blogs Fail

Build Backlinks Online has posted a new item, 'Real Talk About Why So Many Blogs
Fail'

There are so many new blogs being born every day simply because blogging is one
of the easiest and most popular formats for creating content that we can use to
spread our message online. Blogging is also a popular strategy when it comes to
making money online. Many of the top bloggers make a substantial income from
their blogs, enough to support themselves and their families. Others start
blogs because their friends are blogging and they want to be like their friends.
Some find enjoyment in blogging and set up blogs because they enjoy writing and
blogging about things that they love.

You may view the latest post at
http://feedproxy.google.com/~r/BasicBlogTips/~3/Iec9k28jIts/why-many-blogs-fail.html

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] The Taylor Swift Guide to Creating Compelling Content

Build Backlinks Online has posted a new item, 'The Taylor Swift Guide to
Creating Compelling Content'

I am a huge Taylor Swift fan. Okay, granted, I'm not a huge fan of a lot of her
music, but I am a fan of how she connects with her fans. Like many successful
rock stars, Taylor purposely hunts down her most passionate fans and creates
amazing experiences for them. She endlessly shows her fans [...] The Taylor
Swift Guide to Creating Compelling Content is a post from: Convince and Convert:
Social Media Strategy and Content Marketing Strategy


You may view the latest post at
http://www.convinceandconvert.com/social-media-strategy/the-taylor-swift-guide-to-creating-compelling-content/?utm_source=rss&utm_medium=rss&utm_campaign=the-taylor-swift-guide-to-creating-compelling-content

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

Wednesday 27 March 2013

[Build Backlinks Online] Barnacle Reviews on Google+ Local

Build Backlinks Online has posted a new item, 'Barnacle Reviews on Google+
Local'


Posted by David Mihm

Since Google+ Local was released last May, it's safe to say that everyone in the
local search community -- business owners and agencies alike -- has been waiting
with bated breath for the launch of Google's rumored Business Builder dashboard.
For whatever reason, it still isn't out yet, but while you're waiting, there's
no reason you can't take advantage of the most underrated feature of Google+:
the ability to interact on Google+ as a business page. And in particular, to
leave reviews of other businesses as your business page.

Why leave reviews as a page?

Business owners, if this concept doesn't immediately make sense to you, think of
it like this: you probably go to networking events with your local chamber of
commerce, Rotary club, or your industry trade group all the time. When you go to
these events, you're likely wearing your "business owner" hat, rather than your
"weekend warrior" or "soccer mom" hat.



That's essentially what this feature allows you to do: network socially with
your business owner hat on, rather than your personal hat. Just as you would
refer business to other business owners you trust and admire in these networking
environments, the idea behind page-to-page recommendations on social networking
sites works the same way.

Facebook gave its page users this functionality years ago, and many of you are
likely accustomed to leaving comments on other Facebook pages and generally
interacting with their community as their page rather than an individual
profile. You may not have known, though, that you can do the same thing on
Google+.

Why "Barnacle" reviews?

As far as I know, Search Influence's Will Scott was the pioneer of this
concept in local search, which he defined as:

"Attaching oneself to a large fixed object and waiting for the customers to
float by in the current."

As most of you would probably admit, it's hard work to optimize a local
business website/Plus page/etc. So why not leverage pages that are already
visible in your markets for your own visibility? That's the idea behind Barnacle
SEO.

Will's original concept applied to building links to prominent Internet Yellow
Pages profiles, like Yelp business pages or Yahoo Local listings, to increase
the rankings of those profiles. As Facebook became more popular, he also applied
the idea to Facebook conversations on popular pages in a given community (such
as the home of your local newspaper or major/minor league sports team).

The problem is that with Facebook's Timeline interface, comments and
conversations drop "below the fold" awfully quickly, especially on popular pages
with lots of conversations.

The results on Google+ Local pages, when done well, can yield much "stickier"
results.



Getting started: using Google+ as your page

This part is pretty easy. Simply go to http://plus.google.com and log in with
the Google Account under which you claimed your page. At the top righthand side,
you'll see a dropdown that shows the pages on which you're an admin. Simply
select the name of your page. Google will then take you to that page, and when
it does, you should see the icon of the page show up at the top righthand side
(rather than your personal profile photo).

You're now using Google+ as your business!

Getting your feet wet: reviewing friendly businesses


Going back to the Rotary club analogy, you probably already have a network of
existing businesses that you refer friends and clients to in the offline world
-- pay it forward and put your speech about why you would refer people to them
out there for the entire Internet to see.

Chances are, when they Google themselves, they'll see your business' review
right at the top of the list and might even leave YOU a review once they notice
it.

Here's an example of this in action with my friend Mike Ramsey's business.
You'll see, because he doesn't have that many reviews for his newspaper site, my
face-for-radio shows up publicly right at the top of his list.





Kicking it up a notch: finding popular businesses

OK, that was simple enough. But most of your friends aren't likely to run
tremendously popular businesses that are getting a lot of traffic from search,
let alone organic activity on Google+. You want to identify who the most popular
businesses are in your market. You probably have some idea of what they are
already, but here are some algorithmically-influenced ways to find them.

1) Perform a search for "things to do" in your market

Google is showing more and more of these carousel-style results for these
searches every day. The businesses and points of interest shown in this carousel
tend to be the ones that get the most visibility on Google+.



2) See what businesses Google recommends at maps.google.com

Visit http://maps.google.com and see who Google shows to the left of the map --
both in text and image format. Again, these are likely to be popular businesses
with lots of visibility on Google's local products.




3) See where top reviewers are going

Hat tip to my previously-mentioned friend Mike Ramsey of Nifty Marketing, whose
team authored this excellent piece earlier this week about how to find top
reviewers on Google+ Local. Just follow the instructions in that post, and
you'll get a screen like this. Chances are, most of the places visited by top
reviewers are pretty popular.



4) See what places are popular on Foursquare

Visit foursquare.com and see what businesses are mentioned when you search for
"best nearby." These places are going to have a lot of visibility among
techies -- good for a variety of reasons that I won't go into in this post.



Finishing things off: reviewing those businesses

So, the final step in the process is to leave a review of those top businesses.
I don't have any earth-shattering tips for best practices when it comes to
actually leaving a review, but I will point out that the more effort you put
into leaving a killer review, the more likely it is that effort will be
rewarded.

Why is that? Google+ sorts reviews by "Most Helpful" by default. This means that
the better your review is, the more likely it is to have staying power over time
-- which is the whole point of this exercise. You want people to gain real value
from your review and have a positive experience when they see your brand for the
first time.

Just like no one wants to talk to an incessant glad-hander or self-promoter at
a networking event, no one wants to read reviews that talk about how great their
own business is. Just imagine that you're talking to people face-to-face at one
of these events, except instead of a 1:1 interaction, it's more like a 1:100 or
a 1:1000 interaction.

Note that my business' review, though I left it over two weeks ago and haven't
asked anyone to mark it as helpful, is still ranking second out of all reviews.
Imagine the permanent "stickiness" of a review marked as helpful by even a
handful of Google+ users.





Conclusion

Obviously, this technique works best for retail or hospitality industry
businesses, which are probably referring their guests to top attractions anyway,
and are most likely to get traffic from out-of-town guests in the process of
planning their trips.

But my guess is that (especially) in larger markets, even in-town residents are
likely to do "recovery" searches on popular destinations -- where Google is
increasingly pushing searchers towards Knowledge Graph results and popular
reviews from prominent Google+ users. Make sure your business (or your clients'
businesses) have a chance to gain this "barnacle" visibility.

In the comments, I'd love to hear if anyone has used this technique on their
own, or on behalf of their clients, and what the results have been!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten
hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think
of it as your exclusive digest of stuff you don't have time to hunt down but
want to read!






You may view the latest post at
http://feedproxy.google.com/~r/seomoz/~3/A05P8BtXvr8/barnacle-reviews-on-google-plus-local

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] Instagram Marketing for Online Entrepreneurs

Build Backlinks Online has posted a new item, 'Instagram Marketing for Online
Entrepreneurs'

Instagram was started in October 2010 by entrepreneurs Kevin Systrom and Michel
Krieger as a mobile photography project called Burbn. Since then, Instagram has
surpassed Flickr, Picasa, and other well-known photo apps in terms of volume,
importance, and buzz-worthiness. What's it going to take for Internet Marketers
to finally embrace Instagram and use its plug-ins to secure massive customer
goodwill in the years to come?

You may view the latest post at
http://feedproxy.google.com/~r/BasicBlogTips/~3/HGasudlgfbo/instagram-marketing.html

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] Google Reader’s Demise is Irrelevant for Blogs

Build Backlinks Online has posted a new item, 'Google Reader's Demise is
Irrelevant for Blogs'

In this edition of The Baer Facts, I talk with Kyle Lacy of ExactTarget about
the demise of Google Reader and how it's overblown. Back from Australia: yes,
we're returning to a weekly schedule of Baer Facts videos. I had to miss a
couple weeks while doing a series of speaking engagements in Sydney that [...]
Google Reader's Demise is Irrelevant for Blogs is a post from: Convince and
Convert: Social Media Strategy and Content Marketing Strategy


You may view the latest post at
http://www.convinceandconvert.com/blogging-and-content-creation/google-reader-demise-is-irrelevant-for-blogs/?utm_source=rss&utm_medium=rss&utm_campaign=google-reader-demise-is-irrelevant-for-blogs

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

Tuesday 26 March 2013

[Build Backlinks Online] Busting Facebook’s Most Widespread Myth

Build Backlinks Online has posted a new item, 'Busting Facebook's Most
Widespread Myth'

For over a year, rumor has had it on the web that Facebook Page posts reach an
average of only 16% of fans. Actually, Facebook itself started this rumor in
February of 2012 at the Facebook Marketing Conference in New York. This famous
figure of 16% has been repeated on all social media blogs, so that [...]
Busting Facebook's Most Widespread Myth is a post from: Convince and Convert:
Social Media Strategy and Content Marketing Strategy


You may view the latest post at
http://www.convinceandconvert.com/community-management-2/busting-facebooks-most-widespread-myth/?utm_source=rss&utm_medium=rss&utm_campaign=busting-facebooks-most-widespread-myth

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] SEO Finds In Your Server Log

Build Backlinks Online has posted a new item, 'SEO Finds In Your Server Log'


Posted by timresnik

I am a huge Portland Trail Blazers fan, and in the early 2000s, my favorite
player was Rasheed Wallace. He was a lightning rod of a player, and fans either
loved or hated him. He led the league in technical fouls nearly every year he
was a Blazer, mostly because he never thought he committed any sort of foul.
Many of those technicals came when the opposing player missed a free-throw
attempt and Sheed passionately screamed his mantra: BALL DON'T LIE.

Sheed asserts that a basketball has metaphysical powers that act as a system
of checks and balances for the integrity of the game. While this is debatable
(OK, probably not true), there is a parallel to technical SEO: marketers and
developers often commit SEO fouls when architecting a site or creating content,
but implicitly deny that anything is wrong.



As SEOs, we use all sorts of tools to glean insight into technical issues that
may be hurting us: web analytics, crawl diagnostics, and Google and Bing
Webmaster Tools. All of these tools are useful, but there are undoubtedly holes
in the data. There is only one true record of how search engines, such as
Googlebot, process your website: your web server logs. As I am sure Rasheed
Wallace would agree, logs are a powerful source of oft-underutilized data that
helps keep the integrity of your site's crawl by search engines in check.









A server log is a detailed record of every action performed by a particular
server. In the case of a web server, you can get a lot of useful information. In
fact, back in the day before free analytics (like Google Analytics) existed, it
was common to just parse and review your web logs with software like AWStats.



I initially planned on writing a single post on this subject, but as I got
going, I realized that there was a lot of ground to cover. Instead, I will break
it into two parts, each highlighting different problems that can be found in
your web server logs:




This post: how to retrieve and parse a log file, and how to identify problems
based on your server's response codes (404, 302, 500, etc.).

The next post: identifying duplicate content, encouraging efficient crawling,
reviewing trends, and looking for patterns, plus a few bonus non-SEO-related
tips.


Step #1: Fetching a log file

Web server logs come in many different formats, and the retrieval method
depends on the type of server your site runs on. Apache and Microsoft IIS are
two of the most common. The examples in this post will be based on an Apache log
file from SEOmoz.



If you work in a company with a Sys Admin, be really nice and ask him/her for
a log file with a day's worth of data and the fields that are listed below. I'd
recommend keeping the size of the file below 1 GB, as the log file parser you're
using might choke up. If you have to generate the file on your own, the method
for doing so depends on how your site is hosted. Some hosting services store
logs in your home directory in a folder called /logs and will drop a compressed
log file in that folder on a daily basis. You'll want to make sure the file
includes the following columns:





Host: you will use this to filter out internal traffic. In SEOmoz's case,
RogerBot spends a lot of time crawling the site and needed to be removed for our
analysis.

Date: if you are analyzing multiple days, this will allow you to analyze
search engine crawl rate trends by day.

Page/File: this will tell you which directory and file is being crawled and
can help pinpoint endemic issues in certain sections or with certain types of
content.

Response code: knowing the response of the server -- the page loaded fine
(200), was not found (404), the server was down (503) -- provides invaluable
insight into inefficiencies that the crawlers may be running into.

Referrers: while this isn't necessarily useful for analyzing search bots, it
is very valuable for other traffic analysis.

User Agent: this field tells you which search engine made the request;
without it, a crawl analysis cannot be performed.



Apache log files by default are returned without User Agent or Referrer --
this is known as a common log file. You will need to request a combined log
file. Make your Sys Admin's job a little easier (and maybe even impress) by
requesting the following format:

LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" combined

For Apache 1.3+, you just need: CustomLog log/access_log combined

For those who need to manually pull the logs, you will need to create a
directive in the httpd.conf file with one of the above. A lot more detail here
on this subject.
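
To make the format concrete, here is a sketch of a single request line in that
combined format, along with a small Python parser for it. The log line itself
is hypothetical, and the named groups in the pattern are my own labels, not
part of the Apache spec:

import re

# A hypothetical request line in Apache "combined" format:
line = ('66.249.66.1 - - [26/Mar/2013:06:25:24 -0700] "GET /blog HTTP/1.1" '
        '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
        '+http://www.google.com/bot.html)"')

# Field by field: host, identity, user, timestamp, request, status, bytes,
# referrer, user agent.
pattern = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

m = pattern.match(line)
print(m.group("host"), m.group("status"), m.group("agent"))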




Step #2: Parsing a log file

You probably now have a compressed log file like mylogfile.gz and its time
to start digging in. There are myriad software products, free and paid, to
analyze and/or parse log files. My main criteria for picking one includes: the
ability to view the raw data, the ability to filter prior to parsing, and the
ability to export to CSV. I landed on Web Log Explorer
(http://www.exacttrend.com/WebLogExplorer/) and it has worked for me for several
years. I will use it along with Excel for this demonstration. Ive used AWstats
for basic analysis, but found that it does not offer the level of control and
flexibility that I need. Im sure there are several more out there that will get
the job done.



The first step is to import your file into your parsing software. Most web log
parsers will accept various formats and have a simple wizard to guide you
through the import. On the first pass of the analysis, I like to see all the
data, so I do not apply any filters. At this point, you can do one of two
things: prep the data in the parser and export it for analysis in Excel, or do
the majority of the analysis in the parser itself. I like doing the analysis in
Excel in order to create a model for trending (I'll get into this in the
follow-up post). If you want to do a quick analysis of your logs, using the
parser software is a good option.



Import Wizard: make sure to include the parameters in the URL string. As I
will demonstrate in later posts, this will help us find problematic crawl paths
and potential sources of duplicate content.








You can choose to filter the data using some basic regex before it is parsed.
For example, if you only wanted to analyze traffic to a particular section of
your site, you could do something like:








Once you have your data loaded into the log parser, export all spider
requests and include all response codes:








Once you have exported the file to CSV and opened it in Excel, here are some
steps and examples to get the data ready for pivoting into analysis and action:




1. Page/File: in our analysis, we will try to expose directories that could
be problematic, so we want to isolate the directory from the file. The formula
I use to do this in Excel looks something like this:

=IF(ISNUMBER(SEARCH("/",C29,2)),MID(C29,(SEARCH("/",C29)),(SEARCH("/",C29,(SEARCH("/",C29)+1)))-(SEARCH("/",C29))),"no directory")
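
If you prefer scripting to spreadsheets, here is a rough Python equivalent of
that formula. It is only a sketch, and it makes the same assumption the formula
does -- that the page path starts with a slash:

# Pull the top-level directory out of a request path, or return
# "no directory" for root-level files, mirroring the Excel formula above.
def top_directory(path):
    parts = path.split("/")
    # "/blog/post-name" splits into ["", "blog", "post-name"]; a directory
    # exists only if the path contains a second slash.
    return "/" + parts[1] if len(parts) > 2 else "no directory"

print(top_directory("/blog/seo-finds"))  # /blog
print(top_directory("/robots.txt"))      # no directory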




2. User Agent: in order to limit our analysis to the search engines we care
about, we need to search this field for specific bots. In this example, I'm
including GoogleBot, GoogleBot-Image, BingBot, Yahoo, Yandex, and Baidu.

Formula (yeah, it's U-G-L-Y):

=IF(ISNUMBER(SEARCH("googlebot-image",H29)),"GoogleBot-Image",IF(ISNUMBER(SEARCH("googlebot",H29)),"GoogleBot",IF(ISNUMBER(SEARCH("bing",H29)),"BingBot",IF(ISNUMBER(SEARCH("Yahoo",H29)),"Yahoo",IF(ISNUMBER(SEARCH("yandex",H29)),"yandex",IF(ISNUMBER(SEARCH("baidu",H29)),"Baidu","other"))))))



Your log file is now ready for some analysis and should look something
like this:









Let's take a breather, shall we?



Step #3: Uncover server and response code errors

The quickest way to suss out issues that search engines are having with
the crawl of your site is to look at the server response codes being served.
Too many 404s (page not found) can mean that precious crawl resources are being
wasted. Massive numbers of 302 redirects can point to link equity dead-ends in
your site architecture. While Google Webmaster Tools provides some information
on such errors, it does not provide a complete picture: LOGS DON'T LIE.



The first step of the analysis is to generate a pivot table from your log
data. Our goal here is to isolate the spiders along with the response codes
they are being served. Select all of your data and go to Data > Pivot Table.
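
If your parsed log lives in a CSV rather than Excel, the same pivot takes a
couple of lines of pandas. The DataFrame and its column names here are
hypothetical stand-ins for whatever your parser exports:

import pandas as pd

# Hypothetical parsed-log rows; a real frame would come from the parser's CSV.
log = pd.DataFrame({
    "agent": ["googlebot", "bingbot", "bingbot", "googlebot", "bingbot"],
    "status": [200, 200, 302, 404, 200],
    "directory": ["/blog", "/comments", "/message", "/comments", "/blog"],
})

# Spider vs. response code -- the same view as the Excel pivot table.
print(pd.crosstab(log["agent"], log["status"]))

# The 404s-by-directory view used in point 3 below.
print(log[log["status"] == 404].groupby("directory").size())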



On the most basic level, let's see who is crawling SEOmoz on this particular
day:









There are no definitive conclusions that we can make from this data, but there
are a few things that should be noted for further analysis. First, BingBot is
crawling the site at about an 80% higher rate. Why? Second, other bots account
for nearly half of the crawls. Did we miss something in our search of the User
Agent field? As for the latter, we can see from a quick glance that most of what
is bucketed as "other" is RogerBot -- we'll exclude this.



Next, let's have a look at the server codes for the engines that we care most
about.









I've highlighted the areas that we will want to take a closer look at.
Overall, the ratio of good to bad looks healthy, but since we live by the mantra
that every little bit helps, let's try to figure out what's going on.



1. Why is Bing crawling the site at 2x the rate of Google? We should
investigate whether Bing is crawling inefficiently (and whether there is
anything we can do to help it along), or whether Google is not crawling as deep
as Bing (and whether there is anything we can do to encourage a deeper crawl).



By isolating the pages that were successfully served (200s) to BingBot, the
potential culprit is immediately apparent. Nearly 60,000 of the 100,000 pages
that BingBot crawled successfully were user login redirects from a comment
link.









The problem: SEOmoz is architected in such a way that if a comment
link is requested and JavaScript is not enabled, it serves a redirect (returned
as a 200 by the server) to an error page. With nearly 60% of Bing's crawl being
wasted on such dead-ends, it is important that SEOmoz block the engines from
crawling these links.



The solution: add rel="nofollow" to all comment and reply-to-comment
links. Typically, the ideal method for telling an engine not to crawl something
is a directive in the robots.txt file. Unfortunately, that won't work in this
scenario because the URL is being served via JavaScript after the click.
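
For illustration, the markup change might look like this (the URL and anchor
text are hypothetical, not SEOmoz's actual markup):

<!-- Before: engines follow the comment link into the JS-driven redirect -->
<a href="/comments/reply/12345">Reply to this comment</a>

<!-- After: rel="nofollow" tells engines not to follow the link -->
<a href="/comments/reply/12345" rel="nofollow">Reply to this comment</a>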

GoogleBot is dealing with the comment links better than Bing and
avoiding them altogether. However, Google is successfully crawling a handful of
links that are login redirects. Take a quick look at the robots.txt and you
will see that this directory should probably be blocked.



2. The number of 302s being served to Google and Bing is acceptable,
but it doesn't hurt to review them in case there are better ways of dealing
with some of the edge cases. For the most part, SEOmoz is using 302s for defunct
blog category architecture that redirects the user to the main blog page. They
are also being used for private message pages under /message, and a robots.txt
directive should exclude these pages from being crawled at all.
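
A two-line sketch of such a directive, assuming the private message pages all
live under /message:

# Hypothetical robots.txt rule for the /message pages described above
User-agent: *
Disallow: /message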



3. Some of the most valuable data that you can get from your server
logs are links that are being crawled but resolve in a 404. SEOmoz has done a
good job managing these errors and does not have an alarming level of 404s. A
quick way to identify potential problems is to isolate 404s by directory. This
can be done by running a pivot table with Directory as your row label and a
count of Directory in your value field. You'll get something like:









The problem: the main issue that's popping here is that 90% of the 404s
are in one directory, /comments. Given the issues with BingBot and the
JavaScript-driven redirect mentioned above, this doesn't really come as a
surprise.



The solution: the good news is that since we are already using
rel="nofollow" on the comment links, these 404s should also be taken care of.



Conclusion

Google and Bing Webmaster Tools provide you with information on crawl
errors, but in many cases they limit the data. As SEOs, we should use every
source of data that is available. After all, there is only one source of data
that you can truly rely on: your own.



LOGS DON'T LIE!



And for your viewing pleasure, here's a bonus clip for reading the
whole post.












Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten
hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think
of it as your exclusive digest of stuff you don't have time to hunt down but
want to read!






You may view the latest post at
http://feedproxy.google.com/~r/seomoz/~3/hS2lQJpw9hU/seo-finds-in-your-server-log

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] The PPC Toolbox, Part 2: Competitive Research, Landing Page Creation, & Other Helpful Tools

Build Backlinks Online has posted a new item, 'The PPC Toolbox, Part 2: Competitive Research, Landing Page Creation, & Other Helpful Tools'

Yesterday, I shared a number of different PPC tools that will help with building and tracking performance in your account. This time around, I have a few new tools and sites that will help you check some performance statistics of your competition and see how you compare to them. I am also sharing a few tools that I have found myself needing quite a bit when managing my accounts and working with various clients.

PPC Competitor Analysis

SpyFu: SpyFu allows you to do in-depth research on competitors and see what keywords they are bidding on, as well as ad history, average position, and potential daily spend ranges. Whether you want to do in-depth research on a specific competitor or to see everyone that's bidding on your top keywords, SpyFu lets you do it all. It offers a free version which allows you to see the top 10 keywords from a competitor or stats on a specific keyword, but also has a paid option with even more features.

[Image: Spyfu PPC]

Compete.com: This site allows you to see some basic analytic stats on your competition – monthly unique visitors, changes in monthly traffic, competitiveness, and more. It's very useful for gauging how your competition is growing and who you need to focus on. It offers a free version and a more in-depth version for a fee.

[Image: Compete PPC Tool]

Alexa.com: Alexa is very similar to Compete, but offers even more stats, including some search analytics. It provides the top 10 search queries, daily search traffic over the course of the last 30 days, and it even gives you overview stats, like how many keywords a site is targeting in Google and Yahoo/Bing. Additional features include demographic info, contact info, and overall traffic stats. While there is a paid version of Alexa, the free version provides a significant amount of data, so you might not even need it.

[Image: Alexa PPC Tool]

Tools for PPC Landing Page Creation

Unbounce: Unbounce provides multiple, easy-to-use templates for a variety of different landing page types, including lead generation and mobile apps. Create simple and beautiful landing pages with ease!

[Image: PPC Landing Page Tools]

Shopify: Similar to Unbounce, but for e-commerce sites. Create great-looking store fronts and product pages!

Miscellaneous PPC Tools

These tools are especially useful if you manage multiple PPC clients in different time zones.

Area Code Lookup: It doesn't seem like something you'd need all that much, but when you're trying to figure out where the calls generated by your call extension are coming from, it can be a huge time saver!

Currency Converter: Trying to figure out what the US equivalent CPA is for a client in the UK? This handy tool will do it in seconds.

International Dialing Code: Even after calling clients around the world multiple times, I still look up what I need to dial using this easy tool.

Time Zone Converter: Not sure what time it is in Japan and you don't want to wake up your client? The Time Zone Converter quickly figures it out for you!

Thesaurus.com: Can be very helpful for finding new keywords or even finding different words to try in your ad text!

Did I miss any great tools that you use in your PPC toolbox? If so, let me know in the comments!

This post originated on the WordStream Blog. WordStream provides keyword tools for pay-per-click (PPC) and search engine optimization (SEO), aiding in everything from keyword discovery to keyword grouping and organization.

You may view the latest post at
http://feedproxy.google.com/~r/WordStreamBlog/~3/9VtFcbTOccM/ppc-toolbox-part-2

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] Revisiting the Economics of Quality Score: Why QS Is Up to 200% More Valuable in 2013

Build Backlinks Online has posted a new item, 'Revisiting the Economics of Quality Score: Why QS Is Up to 200% More Valuable in 2013'

[Image: Economics of Google AdWords Quality Score]

Just over 4 years ago, Craig Danuloff at Click Equations wrote an article called "The Economics of Quality Score" that provided a simple model to determine the value of Google Quality Score in your AdWords account.

We all know the value of Google's Quality Score in a high-level sense – Quality Score plays an important role in determining your Ad Rank, which is how Google determines the position in which your ad appears, which in turn determines the amount of exposure and clicks your ads will receive.

We also know that Quality Score plays a very important role in determining how much you're charged per click. A now famous video by Hal Varian, Google's Chief Economist, helped clarify this point – that your cost per click is calculated using the formula: Ad Rank of the ad below yours / your Quality Score.
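
As a quick worked example of that formula (with made-up numbers, and leaving aside the rounding and undisclosed factors discussed below):

# Actual CPC = ad rank of the ad below yours / your Quality Score.
ad_rank_below = 16       # e.g. the advertiser below bids $4.00 with QS 4
my_quality_score = 8

print(ad_rank_below / my_quality_score)        # $2.00 per click
print(ad_rank_below / (my_quality_score * 2))  # $1.00 - double the QS, halve the cost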

Using this information, along with some basic arithmetic, Craig published two tables that illustrated the value of Quality Score at that time, as shown here:

[Image: quality score google adwords]

The tables illustrate:

  1. The average savings and "penalties" experienced with high Quality Scores (8, 9, or 10) or low Quality Scores (6 or below), with 7 taken as a neutral average.
  2. The economic cost or benefit of having your Quality Score move up or down by 1 point.

At this point, it's important to note that the true Quality Score Google uses to calculate your CPC isn't really a whole number between 1 and 10; only the Quality Score that is visible to you in your account is. The Quality Score used for calculating CPC is a real number, and the scale is non-linear. Furthermore, there are other factors involved in calculating CPCs that aren't disclosed by Google.

So while the actual numbers in the chart were off, the overall point hopefully remains true – the positive and negative effects of Quality Score on CPC are directionally correct, and so these tables serve as a powerful illustration of the value of Quality Score.

What's the Value of Google Quality Score in 2013?

Much more than it was worth four years ago.

While the overall positive and negative effects of having a good or bad Quality Score remain true today, Quality Score has changed significantly in the last 4 years.

Most significantly, a Quality Score of 7 was set to be the neutral value at the time because QS=7 was the average Quality Score for most keywords back then. As Craig put it:

Note that we set QS=7 as the neutral value because using ClickEquations to review a wide range of accounts we've seen that QS=7 appears to be the mean quality score across a very large and diverse set of keywords.

Since then, Quality Scores of 7 have become much scarcer, and the average Quality Score has fallen.

Here's a quick snapshot of what the impression-weighted average Quality Score looks like in 2013. Note that I'm weighting keyword Quality Scores by impressions, because different keywords have different search volumes.

I compiled this data by doing a manual analysis of several hundred new clients that WordStream signed up in the first two months of 2013.

[Image: Google Quality Score Impact]

Based on this manual analysis, I estimate that today's impression weighted average Quality Score in 2013 is just slightly over 5 (out of a possible 10).

A Quality Score of 5 Is the New 7

Keywords, on an impression-weighted basis, now have an average Quality Score of 5. Therefore, accounts (or campaigns or ad groups) with average volume-weighted keyword Quality Scores better than 5 can be considered better than average, and are thereby benefiting relative to most advertisers. Accounts with average Quality Scores lower than 5 are below average, and those scores are detrimental to your account.

So re-running the calculations, but this time using a Quality Score of 5 as the new mean value for Quality Score (the "base value"), the Economics of Quality Score in 2013 now looks more like this:

[Image: quality score distribution]

The #1 take-away here is that as average Quality Score has drifted lower, the "value" of having an above-average Quality Score has increased tremendously.

Also note that on a percentage change basis, the Quality Score "discounts" for above-average QS keywords have increased from 66% to 200% compared to their 2009 values, as illustrated in the preceding table.

For this and other reasons, I've often argued that Quality Score is the most important metric in achieving AdWords success – and this is truer now than ever before.

Note that the marginal savings or cost increases for a 1-point increase or decrease in Quality Score remain unchanged (by definition), but are included in the table for the sake of completeness.

What's My Average Quality Score? How to Check Your Impression-Weighted Quality Score

Now that I've described the value of Quality Score, and how it's more important now than ever, you might be asking yourself – what's my Quality Score? There are two ways to figure it out – an easy way and a hard way. Let's start with the hard way.

Finding Your Average AdWords Quality Score the Hard Way

To manually figure out your average Quality Score, follow these steps:

  • Log into AdWords.
  • For each Search Campaign, navigate to the Keywords Tab, click on the "columns" button, then add the "Quality Score" attribute, as illustrated below.
  • Export the data into a spreadsheet and calculate your impression-weighted average Quality Score: multiply each keyword's Quality Score by the number of impressions accrued to that keyword, add up this product across all keywords, and then divide by the total number of impressions accrued by those keywords. That's the impression-weighted Quality Score for your account (or campaign, ad group, etc.).

google quality score checker

Finding your Google Quality Score using the AdWords Interface
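If you export that keyword report to CSV, the weighted-average step can be scripted. Here's a minimal Python sketch, assuming hypothetical file and column names ("keywords.csv", "Quality score", "Impressions") – rename them to match your actual export.

    import csv

    total_impressions = 0
    weighted_qs_sum = 0

    # File and column names below are assumptions; adjust to your export.
    with open("keywords.csv", newline="") as f:
        for row in csv.DictReader(f):
            impressions = int(row["Impressions"].replace(",", ""))
            weighted_qs_sum += int(row["Quality score"]) * impressions
            total_impressions += impressions

    print("Impression-weighted Quality Score:",
          weighted_qs_sum / total_impressions)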

Finding Your AdWords Quality Score the Easy Way

Another way to quickly visualize your impression-weighted Quality Score distribution is to just grade your account using the AdWords Performance Grader. This free tool will do an instant audit of your PPC account across 8 different key performance metrics, including impression-weighted Quality Score.

The report will calculate and display your average Quality Score and plot a distribution of the number of impressions happening at each visible Quality Score for the last 90 days, and compare that to a "Recommended Curve" for your business. Here's an example of what the Quality Score section of the report looks like.

Check Google Quality Score

Now Is the Time to Improve Your Google Quality Score

Knowing that your Quality Scores may be saving you up to 50%, or costing you up to 400%, should provide strong motivation to both understand and work to improve them.

This post originated on the WordStream Blog. WordStream provides keyword tools for pay-per click (PPC) and search engine optimization (SEO) aiding in everything from keyword discovery to keyword grouping and organization.

You may view the latest post at
http://feedproxy.google.com/~r/WordStreamBlog/~3/R4YBYY_jkyw/google-quality-score

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

Monday 25 March 2013

[Build Backlinks Online] An Interview with Chiropractic Master Dr. Mike Reid

Build Backlinks Online has posted a new item, 'An Interview with Chiropractic Master Dr. Mike Reid'

In this insightful video, you’ll discover what it REALLY takes to become a chiropractic master this year and beyond. You’ll learn some of the top habits of the most successful doctors in practice. Lastly, you’ll find out how to increase your patient volume and revenue by 25-50% (or more) in the next two months.

This video was filmed in the heart of Grace Bay on the beautiful island of Providenciales. Click the play button to watch this valuable training.


Just an FYI… the free practice-growth teleclass I refer to in the video can be found at the first link in the section below. It’s one of the most popular calls I’ve ever done, and for good reason. On the audio training, you’ll get access to an ad that attracted a whopping 251 new patients.

See for yourself why so many chiropractors were raving!

Did you like this post? If so, click the Facebook "like" button below and share it with your friends.

Related Blog Posts:

[REPLAY] The 30 Day New Patient AVALANCHE!!

Fact or Fiction: Chiropractic Saves Lives?

Free Social Media Training for Chiropractors [video]

Chiropractor Fractures Wrists in Accident! Now What?

You may view the latest post at
http://dcincome.com/blog/an-interview-with-chiropractic-master-dr-mike-reid/

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

[Build Backlinks Online] The PPC Toolbox, Part 1: Tracking, Bulk Editing, and Keyword Expansion

Build Backlinks Online has posted a new item, 'The PPC Toolbox, Part 1: Tracking, Bulk Editing, and Keyword Expansion'

Like any good carpenter, a PPC manager needs a number of tools on hand to get the job done, and done right. Through the years, I have come to rely on a number of different PPC tools that help in a variety of areas, from competitor analysis to setting up conversion and analytics tracking. Here are some of my favorites!

PPC Tools for Setting Up Analytics and Conversion Tracking

Google Analytics URL Builder: One of the most time-consuming tasks is setting up Bing so it can be properly tracked in Analytics – you have to go in and add special tracking parameters to the ad (or even keyword) destination URLs. Google offers a simple tool that lets you fill in a few fields and then spits out a full URL with the Analytics codes added! Just change the destination URL to the new URL and it will start tracking in Analytics in no time!

Google Analytics URL Builder
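For example, a destination URL tagged for a hypothetical Bing campaign might look like this (the domain and parameter values are placeholders you choose in the tool):

    http://www.example.com/landing-page?utm_source=bing&utm_medium=cpc&utm_campaign=spring_sale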

Tag Assistant by Google: One of the most frequent questions I get is "How do I know if my Analytics or AdWords conversion tracking is set up correctly?" Google must have received this question a number of times as well because they created a handy-dandy Chrome extension that quickly and easily tells you whether your Analytics and AdWords tracking is set up properly. It even tells you what is wrong if it's set up improperly. The only downside is that it doesn't check Bing conversion/analytics codes.

Google Tag Assistant

Bulk Editing Tools

AdWords Editor: Make bulk changes to your AdWords accounts offline. Change anything from bids to match types to even campaign settings. Everything is done offline so you don't have to worry about making any mistakes before posting.

Bing Ads Editor: Just like the AdWords Editor, but for Bing!

PPC Tools for Keyword Expansion and Research

WordStream: The best around! Build out ad groups using our database of over a BILLION keywords or find new keywords from your own search queries using our QueryStream tool.

PPC Keyword Research Tools

Jumbo Keyword: This tool lets you convert a list of keywords into multiple match types quickly. Just add your list of keywords, select the match types you need created, and voila!

Jumbo Keywords

Google Trends: So your top keywords had a bit of a dip last week, but you didn't make any changes to your account? Check Google Trends to see whether it's seasonality that's affecting your competitors as well. It provides interest over time, regional interest, and even some related terms to help build out your keywords.

Google Trends Keyword Tool

Google Ad Preview Tool: Check whether your ad is showing, no matter where you are or what your targeting settings are! If you use it within your AdWords account (under Tools and Analysis), it will also tell you why your ad is or isn't showing and which keywords are triggering it.

Google AdWords Preview Tool

That's it for now! Stay tuned for my next installment which will go over tools for competitor analysis, landing page creation, and some other tools to help you with account and client management.

Did I miss anything? Let me know in the comments!

This post originated on the WordStream Blog. WordStream provides keyword tools for pay-per click (PPC) and search engine optimization (SEO) aiding in everything from keyword discovery to keyword grouping and organization.

You may view the latest post at
http://feedproxy.google.com/~r/WordStreamBlog/~3/HXNtfJ6kkDM/ppc-toolbox-part-1

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com

Sunday 24 March 2013

[Build Backlinks Online] Back to the Future: Forecasting Your Organic Traffic

Build Backlinks Online has posted a new item, 'Back to the Future: Forecasting
Your Organic Traffic'


Posted by Dan Peskin
This post was originally in YouMoz, and was promoted to the main blog because it
provides great value and interest to our community. The author's views are
entirely his or her own and may not reflect the views of SEOmoz, Inc.

Great Scott! I am finally back again for another spectacularly lengthy post,
rich with wonderful titles and, this time, statistical goodness. It just so
happens that in my past, short-lived career I was a Forecast Analyst (not this
kind). So today, class, we will be learning about the importance of forecasting
organic traffic and how you can get started. Let's begin our journey.



Forecasting is Your Density. I Mean, Your Destiny


Why should I forecast? Besides the obvious answer – it's f-ing cool to predict
the future – there are a number of benefits for both you and your company.

Forecasting adds value in both an agency and in-house setting. It provides a
more accurate way to set goals and plan for the future, which can be applied to
client projects, internal projects, or overall team/dept. strategy.

Forecasting creates accountability for your team. It allows you to continually
set goals based on projections and to monitor performance through forecast
accuracy. (Keep in mind that exceeding goals is not necessarily a good thing,
which is why forecast accuracy is important; we will discuss this more later.)

Forecasting teaches you about inefficiencies in your team, process, and
strategy. The more you segment your forecast, the deeper you can dive into
finding the root of the inaccuracies in your projections. And the more granular
you get, the more accurate your forecast – so accuracy is a function of
segmentation (assuming you continually work to improve it).

Forecasting is money. This is the most important concept of forecasting, and
probably the point where you decided that you will read the rest of this
article.

The fact that you can improve inefficiencies in your process and strategy
through forecasting means you can effectively increase ROI. Every hour and
resource allocated to a strategy that doesn't deliver results can be
reallocated to something that proves to be a more stable source of increased
organic traffic. So finding out which strategies consistently deliver the
results you expect means you're investing money into resources that have a
higher probability of delivering a larger ROI.

Furthermore, providing accurate projections, whether it's to a CFO, a manager,
or a client, gives the reviewer a more compelling reason to invest in the work
that backs the forecast. Basically, if you want a bigger budget to work with,
forecast the potential outcome of that bigger budget and sell it. Sell it well.

Okay. Flux Capacitor, Fluxing. Forecast, Forecasting?




I am going to assume that everyone's DeLorean is in the shop, so how do we
forecast our organic traffic?

There are four main factors to account for in an organic traffic forecast:
historical trends, growth, seasonality, and events. Historical data is always
the best place to start your forecast. You will want as many historical data
points as possible, but the accuracy of the data should come first.

Determining the Accuracy of the Data


Once you have your historical data set, start analyzing it for outliers. An
outlier is to a forecast what Biff is to George McFly – something you need to
punch in the face and then make wash your car 20 years in the future. Well,
something like that.

The quick way to find outliers is to simply graph your data and look for
spikes. Each spike, whether up or down, is associated with a data point – your
outlier. This method does leave room for error, as the determination of
outliers is based on your judgement rather than statistical significance.

The long way is much more fun and requires a bit of math. I'll provide some
formula refreshers along the way.

Calculating the mean and the standard deviation of your historical data is the
first step.

Mean

    mean = (x_1 + x_2 + … + x_n) / n

Standard Deviation

    s = sqrt( ((x_1 − mean)² + (x_2 − mean)² + … + (x_n − mean)²) / (n − 1) )

Looking at the standard deviation can immediately tell you whether you have
outliers. The standard deviation tells you how closely your data falls to the
mean: the lower the standard deviation, the closer the data points are to each
other.

You can go a step further and set a rule by calculating the coefficient of
variation (COV). As a general rule, if your COV is less than 1, the variance in
your data is low and there is a good probability that you don't need to adjust
any data points.

Coefficient of Variation (COV)

    COV = s / mean

If all the signs point to you having significant outliers, you will now need
to determine which data points those are. A simple way to do this is to
calculate how many standard deviations away from the mean each data point is.

Unfortunately, there is no clear-cut rule for qualifying an outlier by its
distance from the mean, because every data set is distributed differently.
However, I would suggest starting with any data point that is more than one
standard deviation from the mean.
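Here is a minimal Python sketch of that screening process, using assumed
monthly visit counts and the one-standard-deviation starting rule suggested
above:

    import statistics

    # Twelve months of hypothetical organic visits.
    visits = [98000, 101000, 97000, 103000, 99000, 100000,
              152000, 98000, 102000, 96000, 149000, 100000]

    mean = statistics.mean(visits)
    stdev = statistics.stdev(visits)  # sample standard deviation
    cov = stdev / mean                # coefficient of variation

    print(f"mean={mean:.0f}  stdev={stdev:.0f}  COV={cov:.2f}")

    # Flag points more than one standard deviation from the mean --
    # a starting rule of thumb, not a statistical law.
    for month, v in enumerate(visits, start=1):
        if abs(v - mean) > stdev:
            print(f"Month {month}: {v} is a candidate outlier")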

Making the decision about whether outliers exist takes time and practice.
These general rules of thumb can help you figure it out, but it really relies
on your ability to interpret the data and understand how each data point
affects your forecast. You have the inside knowledge about your website; your
equations and graphs don't. So put that knowledge to use and start adjusting
your data accordingly.

Adjusting Outliers


Ask yourself one question: should we account for this spike? Having spikes or
outliers is normal; whether you need to do anything about them is what you
should be asking yourself now. Use that inside knowledge of yours to determine
why the spike occurred, whether it will happen again, and ultimately whether it
should be accounted for in your future forecast.



In the case that you don't want to account for an outlier, you will need to
accurately adjust it down or up to the number it would have been without the
event that caused the anomaly.

For example, let's say you launched a super-original infographic about the
Olympics in July last year that brought your site an additional 2,000 visits
that month. You may not want to account for this, as it will not be a recurring
event – or maybe it fails to bring qualified organic traffic to the site (if
the infographic traffic doesn't convert, then your revenue forecast will be
inaccurate). The resulting action would be to adjust the July data point down
by 2,000 visits.

On the flip side, what if your retail electronics website has a huge positive
spike in November due to Black Friday? You should expect that rise in traffic
to recur this November and account for it in your forecast. The resulting
action here is to simply leave the outlier alone and let the forecast do its
business. (This is also an example of seasonality, which I will talk about more
later.)

Base Forecast


When creating your forecast, you want to build a base before you start
incorporating additional factors. The base forecast is usually a flat forecast,
or a line straight down the middle of your charted data. In terms of numbers,
the flat version can simply use the mean for every data point; the line down
the middle follows the trend of the graph, so it is the equivalent of the
average but accounting for slope, too. Excel provides a formula which actually
does this for you:

    =FORECAST(x, known_y's, known_x's)

Given the historical data, Excel will output a forecast based on that data and
the slope from the starting point to the end point. Depending on your data,
your base forecast could be where you stop, or where you begin developing a
more accurate forecast.
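Outside of Excel, the same straight-line base forecast can be sketched with a
least-squares fit. A minimal Python example, with all historical numbers
assumed:

    import numpy as np

    # Hypothetical 12 months of adjusted historical organic visits.
    history = np.array([100, 104, 103, 108, 110, 109,
                        113, 115, 118, 117, 121, 124]) * 1000
    months = np.arange(len(history))

    # Fit a straight line (slope and intercept), like Excel's =FORECAST().
    slope, intercept = np.polyfit(months, history, deg=1)

    # Project the next 6 months along that trend line.
    future = np.arange(len(history), len(history) + 6)
    base_forecast = slope * future + intercept
    print(base_forecast.round())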

Now how do you improve your forecast? It's a simple idea: account for anything
and everything the data can't account for on its own. You don't need to go
overboard here – I would draw the line well before you start forecasting the
decrease in productivity on Fridays due to beer o'clock. I suggest accounting
for three key factors, and accounting for them well: growth, seasonality, and
events.

Growth


You have to have growth. If you aren't planning to grow anytime soon, then
this is going to be a really depressing forecast. Including growth can be as
simple as adding 5% month over month, based on a higher-level estimate from
management, or as detailed as estimating incremental search traffic by keyword
from significant ranking increases. Either way, the important part is being
able to back your estimates with good data and knowing where to look for it.
With organic traffic, growth can come from a number of sources, but here are a
couple of key components to consider:

Are you launching new products?



New products mean new pages, and depending on your domain's authority and your
internal linking structure, you can see an influx of organic traffic. If you
have analyzed the performance of newly launched pages, you should be able to
estimate, on average, what percentage of search traffic from relevant, targeted
keywords they can bring over time.

Google Webmaster Tools CTR data and the AdWords tool's search volumes are your
best bets for acquiring the data you need for this estimate. You can then apply
the estimate to search volumes for the keywords relevant to each new product
page and determine the additional organic traffic growth that new product lines
will bring.
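A back-of-the-envelope version of that estimate, with every number below
assumed purely for illustration:

    # Hypothetical inputs: monthly search volume per target keyword (from
    # the AdWords tool) and an average CTR observed for similar new pages
    # (from Google Webmaster Tools).
    keyword_volumes = {"blue widgets": 8100, "buy blue widgets": 1300}
    expected_ctr = 0.04  # 4%, assumed from past new-page performance

    incremental_visits = sum(v * expected_ctr for v in keyword_volumes.values())
    print(f"Estimated extra organic visits/month: {incremental_visits:.0f}")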

Tip: Make sure to consider your link building strategies when analyzing past
product page data. If you built links to these pages over the analyzed time
period, then you should plan on doing the same for the new product pages.

What ongoing SEO efforts are increasing?

Did you get a link building budget increase? Are you retargeting several key
pages on your website? These things can easily be factored in, as long as you
have consistent data to back them up. Consistency in strategy is truly an
asset, especially in the SEO world. With the frequency of algorithm updates,
people tend to shift strategies fairly quickly. However, if you are consistent,
you can quantify the results of your strategy, use those results to improve it,
and understand its effects on the applied domain.

The general idea here is that if you know historically the effect of certain
actions on a domain, then you can predict how relative changes to the domain
will affect the future (given there are no drastic algorithm updates).

Let's take a simple example. Say you build 10 links to a domain per month, and
the average Page Authority of the targeted pages is 30 and the Domain Authority
is 50 when you start. Over time you see, as a result, organic traffic increase
by 20% for the pages targeted in this campaign. So if your budget increases and
allows you to apply the same campaign to other pages on the website, you can
estimate a 20% increase in organic traffic to those pages. This example assumes
the new target pages have:


Target keywords with similar search volumes

Similar authority prior to the campaign start

Similar existing traffic and ranking metrics

Similar competition


While this may be a lot to assume, it serves the purpose of the example. These
are the things that need to be considered, and these are the types of campaigns
that should be invested in from an SEO standpoint. When you find a strategy
that works, repeat it and control the factors as much as possible. This will
produce outcomes that are the least likely to diverge from expected results.

Seasonality


To incorporate seasonality into an organic traffic forecast, you will need to
create seasonal indices for each month of the year. A seasonal index expresses
how a month's expected value relates to the average expected value – in this
case, how each month's organic traffic compares with the mean monthly organic
traffic.

So let's say your average organic traffic is 100,000 visitors per month and
your adjusted traffic for last November was 150,000 visitors; your index for
November is then 1.5. In your forecast, you simply multiply each month by its
corresponding seasonal index.
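In code, computing and applying those indices might look like the following
Python sketch (all traffic numbers are assumed, chosen to mirror the example
above):

    # Hypothetical adjusted monthly visits for one year, Jan..Dec.
    monthly_visits = [90000, 85000, 95000, 100000, 98000, 92000,
                      88000, 91000, 105000, 110000, 150000, 96000]

    mean_visits = sum(monthly_visits) / len(monthly_visits)
    seasonal_index = [v / mean_visits for v in monthly_visits]
    # November's index here is 150000 / 100000 = 1.5

    # Apply: multiply a flat base forecast by each month's index.
    base_forecast = 100000
    seasonal_forecast = [base_forecast * idx for idx in seasonal_index]
    print([round(v) for v in seasonal_forecast])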

To calculate these seasonal indices, you need data, of course. Using adjusted
historical data is the best solution, provided you know that it reflects the
seasonality of the website's traffic well.

Remember all that seasonal search volume data the AdWords tool provides? It
can actually be put to practical use! So if you haven't already, you should
probably get with the times and download the AdWords API Excel plugin from
SEOgadget (if you have API access). It makes gathering seasonal data for a
large set of keywords quick and easy.

What you can do here is gather data for all the keywords that drive your
organic traffic, aggregate it, and see whether the trends in search align with
the seasonality you are observing in your adjusted historical data. If there is
a major discrepancy between the two, you may need to dig deeper into why, or
shy away from accounting for it in your forecast.

Events


This one should be straightforward. If you have big events coming up, find a
way to estimate their impact on your organic traffic. Events can be anything
from a yearly sale, to a big piece of content being pushed out, or a planned
feature on a big media site.

All you have to do here is determine the expected increase in traffic from
each event you have planned. This all goes back to digging into your historical
data. What typically happens when you have a sale? What's the change in traffic
when you launch a huge content piece? If you can get an estimate, just add it
to the month in which the event will take place.

Once you have this covered, you should have the last piece of a good-looking
forecast. Now it's time to put it to the test.

Forecast Accuracy


So you have looked into your crystal ball and finally made your predictions –
but what do you do now? The process of forecasting is a cycle, and you now need
to measure the accuracy of your predictions. Once you have the actuals to
compare to your forecast, you can measure your forecast accuracy and use it to
determine whether your current forecasting model is working.

There is a basic formula you can use to compare your forecast to your actual
results, which is the mean absolute percent error (MAPE):

    MAPE = (1/n) × Σ | (Actual_t − Forecast_t) / Actual_t | × 100
This formula takes the mean of the absolute percent error across each time
period, giving you your forecast accuracy for the total forecast period.

Additionally, if your forecast accuracy is low, you will want to analyze the
error for individual periods. Looking at the percent error month to month will
allow you to pinpoint where the largest error in your forecast is and help you
determine the root of the problem.
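A minimal Python sketch of that accuracy check, with the forecast and actual
values assumed for illustration:

    # Hypothetical monthly forecast vs. actual organic visits.
    forecast = [100000, 105000, 110000, 150000]
    actual = [98000, 112000, 109000, 160000]

    # Mean absolute percent error across the whole forecast period.
    errors = [abs((a - f) / a) for f, a in zip(forecast, actual)]
    mape = sum(errors) / len(errors) * 100
    print(f"MAPE: {mape:.1f}%")

    # Month-by-month percent error shows where the model drifts.
    for month, e in enumerate(errors, start=1):
        print(f"Month {month}: {e * 100:.1f}% error")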

Keep in mind that accuracy is crucial if organic traffic is a major source of
product revenue for your business. This is where exceeding expectations can be
a bad thing: if you exceed your forecast, you can end up with stock-outs on
products and a loss of potential revenue.

Consider the typical online consumer: do you think they will wait to purchase
your product on your site if they can find it somewhere else? Online shoppers
want immediate results, so making sure you can fulfil their order makes for
better customer service and fewer bounces on product pages (which, as we know,
can affect rank).




The top result for this query is out of stock, which will not help it maintain
that position in the long term.

Now this doesn't mean you should over-forecast; there is a price to pay at
both ends of the spectrum. Inflating your forecast means you could bring in
excess inventory, since purchasing is tied to product expectations. That can
create unnecessary inventory expenses, such as increased storage costs, and tie
up cash flow until the excess product is shipped. And depending on product life
cycles, continuing this practice can lead to an abundance of obsolete product
and huge financial problems.

So once you have measured your forecast against actuals and considered the
above, you can repeat the process more accurately and refine your forecast.
Well, this concludes our crash course in forecasting and how to apply it to
organic traffic. So what are you waiting for? Start forecasting!

Oh and here is a little treat to get you started.

Are you telling me you built a time machine...in Excel?


Well, no – Excel can't help you time travel, but it can help you forecast. The
way I see it, if you're gonna build a forecast in Excel, why not do it in
style?

I've decided that your brain has probably turned to mush by now, so I am going
to help you on your way to forecasting until the end of days. I am providing a
stylish little Excel template that has several features, but I warn you: it
doesn't do all the work.

It's nothing too spectacular, but this template will put you on your way to
analyzing your historical data and building your forecast. Forecasting isn't an
exact science, so naturally you need to do some work and make the call on what
needs to be added to or subtracted from the data.

What this Excel template provides:


The ability to plug in the last two years of monthly organic traffic data and
see a number of statistical calculations that let you quickly analyze your
historical data.

The frequency distribution of your data.

Highlighting of data points that are more than a standard deviation from the
mean.

The metrics we discussed (mean, growth rate, standard deviation, etc.).


Oh wait there's more?




Yes. Yes. Yes. This simple tool will graph your historical and forecast data,
provide you with a base forecast, and give you a place to easily add anything
you need to account for in the forecast. Lastly, for those who don't have
revenue data tied to Analytics, it provides a place to add your AOV and average
conversion rate so you can estimate future organic revenue as well. Now go have
some fun with it.
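For the curious, the revenue estimate the template derives from AOV and
conversion rate reduces to one line of arithmetic – a sketch with assumed
inputs:

    forecast_visits = 120000  # from your organic traffic forecast
    conversion_rate = 0.02    # assumed average conversion rate (2%)
    aov = 85.00               # assumed average order value

    estimated_revenue = forecast_visits * conversion_rate * aov
    print(f"Estimated organic revenue: ${estimated_revenue:,.0f}")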
________________________________________________________________________________________
Obviously we can't cover everything you need to know about forecasting in a
single blog post, from both a strategic and a mathematical standpoint. So let
me know what you think, what I missed, or whether there are any points or tools
that the typical marketer should add to their skillset and spend some time
learning.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten
hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think
of it as your exclusive digest of stuff you don't have time to hunt down but
want to read!






You may view the latest post at
http://feedproxy.google.com/~r/seomoz/~3/pKZw60RSjy4/back-to-the-future-forecasting-your-organic-traffic

You received this e-mail because you asked to be notified when new updates are
posted.
Best regards,
Build Backlinks Online
peter.clarke@designed-for-success.com