Frequently Asked Questions

These FAQs are always expanding and gaining topics. We encourage you to ask an SEO question if it isn't already included. Just visit Submit An SEO FAQ and we'll respond to you; if the question seems relevant to many readers we'll add it here.

Mobile SEO (3)

Let me show you how important it is….

desktop vs mobile search results

Why is realtor.com not ranked higher than zumper.com in the mobile search on the left? Consider these metrics:

Realtor.com: Domain Score 55, Trust Score 58, Alexa Rank 763, Registered 1996, Ads 3,900, Backlinks 57,000,000, Traffic Rank 108

Zumper.com: Domain Score 37, Trust Score 44, Alexa Rank 17,000, Registered 2004, Ads 0, Backlinks 1,700,000, Traffic Rank 2,830

Realtor.com wins on every metric, so why is it below zumper.com in the mobile search?

Site Speed Test on GTMetrix

Realtor.com Fails Speed

site speed

Zumper.com Passes Speed

page load

So in this example we clearly see a more popular site beaten by a less established one, and the single factor the smaller site did better on was speed. And we can't dismiss this with "well, it's only important on mobile." In case you missed it…

60% of searches are mobile


Content Delivery Networks (CDNs) are used to speed up sites by removing the bottleneck created by having one server provide all the files. The added servers are spread out over a geographical area, bringing the files closer to end users. The network may be global or regional.

Cloudflare and Jetpack are both examples of CDNs; a popular paid option is AWS's CloudFront. The servers in these networks are called edge servers because they sit at the edge of the provider's network, all over the world.
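If you're curious whether a page is actually coming off a CDN edge, the response headers usually give it away. Here's a minimal Python sketch, assuming the requests library is installed; example.com is a stand-in for your own URL, and the header names are ones Cloudflare and CloudFront commonly send:

```python
import requests

def cdn_check(url: str) -> None:
    """Print response headers that commonly reveal a CDN edge server."""
    resp = requests.get(url, timeout=10)
    # Cloudflare edges typically answer with "server: cloudflare" and a
    # "cf-cache-status" header; CloudFront responses usually carry "x-cache"
    # and "x-amz-cf-pop". "age" hints the response came from a cache.
    for header in ("server", "cf-cache-status", "x-cache", "x-amz-cf-pop", "age"):
        if header in resp.headers:
            print(f"{header}: {resp.headers[header]}")

cdn_check("https://example.com/")  # hypothetical URL - try your own site
```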


Volume, and because Google says it is. That's honestly the best answer… now for the why:

  • More than half of the searches on Google come from mobile devices. If you only think in terms of desktop searches you ignore the majority of searches. A mobile device usually displays content in a small, limited-width view, so desktop pages don't make for a good mobile experience. Mobile devices also generally load pages slower since they lack a physical connection to the network. Pages need to load fast and be streamlined for the best experience, so page load speed is extremely important (see the sketch after this list).
  • Mobile First – an initiative Google started in 2018. The name is self-explanatory. If you don't want to put mobile first, you're ignoring the greatest customer you have… Google.
  • Mobile Index – to address the disparity between the mobile and desktop experiences, Google created a second index of results that's independent of the desktop results. You could be the number one result on desktop and not listed at all on mobile. Since more than half of searches happen on mobile devices, you'd lose over half your opportunities to reach a visitor.
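As a quick way to see how Google grades your mobile experience, you can query the PageSpeed Insights API with the mobile strategy. A minimal sketch, assuming the requests library; example.com is a placeholder, and the response structure follows the public v5 API:

```python
import requests

# strategy="mobile" asks Google to audit the page as a mobile device would see it.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(
    API,
    params={"url": "https://example.com/", "strategy": "mobile"},
    timeout=60,
)
result = resp.json()
score = result["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```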

Ignore mobile devices at your own peril; your SEO efforts are doomed, in my humble opinion. I don't work on projects I don't think can succeed, so I'd bow out of any project that ignores mobile.

Here is an example: searching "rental homes in louisville" you'll notice the desktop and mobile results come back in a different order.

desktop vs mobile search results

It appears that while realtor.com is considered better for desktop, zumper.com has the better mobile experience. It's likely better laid out and faster than realtor.com, so it beats that site on mobile. Who wins here? Zumper, since there are more searches on mobile than desktop. You might think, what's the big difference, it's one position… but the difference between 3rd and 4th, or 2nd and 3rd, is huge, and the difference in traffic between 1st and 2nd position is more than 50%. In a billion dollar industry we're talking about potentially hundreds of millions of dollars.

And it IS likely layout and speed. Notice in the desktop view realtor.com has a higher trust score and a higher domain score. Realtor.com has a higher rank, advertises more, and was registered as a domain first… by almost every SEO metric realtor.com should be higher than zumper.com. BUT on mobile it is not.


Political SEO (3)

I have to admit they are useful, in my opinion. Say you have 10 blogs that you do SEO for… you can use the RSS feed with an IFTTT applet to auto-generate your social media announcements of new content, rather than manually posting to Facebook, Twitter, LinkedIn, Blogger, Tumblr, Instagram and so on.
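For the curious, here's roughly what such an applet does under the hood: read the feed, turn each new entry into an announcement string, and hand it to each network's posting tool. A small sketch using the feedparser library, with a placeholder feed URL:

```python
import feedparser  # pip install feedparser

# Read the blog's RSS feed and build a ready-made announcement per entry.
feed = feedparser.parse("https://example.com/feed/")  # hypothetical feed URL
for entry in feed.entries[:5]:
    announcement = f"New on the blog: {entry.title} {entry.link}"
    print(announcement)  # hand this off to each social network's posting API
```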

You can also ensure your blog gets regular updates, which improves the likelihood of crawlers visiting your site more frequently. It's not the main-attraction content you'd be pulling in, but filler content that is still relevant. Think of it like this: when you go to your local newspaper's site, most of the national news is from the Associated Press. That makes those articles duplicate content, but duplicate content isn't as bad as folks make it out to be… it happens, and in journalism it's more the norm than the exception.

Sometimes I feel that RSS content aggregated well can create unique content. For example, during an election I built a site that aimed to have all the local election news in one spot. VoteLouisville.com pulled the RSS feeds of anyone running for office, which created a place where you could read two candidates' updates side by side and compare what they were focusing on in the election. The candidates want their content reposted, and the voters likely want to see both sides, so in that scenario I think the RSS content made uniquely different aggregated content.


Negative SEO is a branch of SEO that largely came into existence with Google's Penguin update. For the first time, a search engine was prepared to lower your ranking based on what it perceived to be a site gaming the system. Not only did sites change the way they promoted themselves, but this created a very real opportunity to harm competitors' sites.

All you have to do is mimic spammy promotion of a site to draw Google into penalizing that site. This can include:

  • Building too many unnatural backlinks from low quality sites, too quickly to have been organic
  • Using anchor text with adult themes or words associated with low-brow tactics
  • Associating the competitor or target site with link building schemes, such as a PBN
  • Duplicating their content, thereby watering down the value of their original content

Google On Negative SEO


Political SEO is search engine optimization meant to affect a political outcome by ranking a political message for a specific intended audience ahead of a future event, often an election. Political SEO is commonly mixed with PPC (Pay Per Click) advertising, since SEO work generally requires time to build momentum while political campaigns are often short-lived.

Political SEO may promote a candidate, party, idea, platform or agenda, and it intends to influence an audience's perception of that focus.


Category: Political SEO

SEO (22)

Domain Authority (DA) is a Moz.com metric meant to predict a domain's ability to rank content on a search engine. It relies heavily on backlinks and anchor text, just like Google. DA is a number between 0 and 100, and it gets increasingly harder to raise. A site may have 20 backlinks from 20 different sites and earn a DA of 10, but when that site has 40 domains linking to it, it's not going to have a DA of 20; it might have a 15.

The law of diminishing returns is easy to see in this topic. Over a 4 month period, a DA 33 site increased its linking domains from 400 to 800 and its DA rose to 40: 7 points for double the domains. This site, as of 2/1/2019, has a DA of 41; a month ago it had a DA of 40. In those 30 days it added about 34 linking domains for a total of 854. The other site mentioned above had about 800 linking domains when it reached DA 40 as well.

An in-depth discussion is included in an article on the site: Domain Authority, Why Should I Care?


Category: SEO

Meta Descriptions In Google Search Results

170 Meta Description Character Limit

There's debate on this, but it's generally accepted that about 170 characters is the full length Google displays. It used to be a little lower, which is why you may still see 150 or 160 listed as the limit; those lower numbers are based on outdated info.

Writing A Good SEO Meta Description

Tons of programs and most free audit tools will check the length for you, but length is not as important as the description being:

  • There
  • Unique
  • Relevant
  • Catchy

A lot of sites never declare a description at all, which leaves Google to create one for you, usually just the first couple of sentences of the article. That's pretty bad for your conversion rate in organic search. Why check out the content if the author didn't even bother to provide a decent description?

In the same vein, your description should be unique. What makes this page worth clicking on? What makes it different from any other page on your site? If your descriptions aren't unique, you don't really make a case for why that specific page exists.

And while you may be selling something, keep in mind your description is not an advertisement for your business; it is an ad for that page. This page, for example, should talk about SEO FAQs, even though I sell SEO services and not SEO FAQs. A searcher seeking SEO FAQs isn't going to click on a description of the services I offer, and a person seeking SEO services isn't going to care much about meta descriptions if they're looking to contract that worry out to someone else.

Finally, your description should be catchy… it needs to stand out among the others and get someone to click. Overly technical descriptions will discourage someone seeking knowledge about the topic, so keep it simple. You don't need to prove your wealth of knowledge in the description; a person searching for information likely doesn't know the topic yet… why would they be searching for it if they already knew it?
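If you'd rather script the basic checks than eyeball every page, here's a minimal sketch (assuming the requests and BeautifulSoup libraries; the URLs are placeholders) that flags missing descriptions and ones long enough to be truncated:

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def audit_description(url: str, limit: int = 170) -> str:
    """Check that a page's meta description exists and fits the display limit."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content", "").strip():
        return f"{url}: MISSING - Google will improvise one from your first sentences"
    text = tag["content"].strip()
    status = "over the limit, expect truncation" if len(text) > limit else "ok"
    return f"{url}: {len(text)} chars ({status})"

# Hypothetical pages; collect results across a site to spot duplicates too.
for page in ("https://example.com/", "https://example.com/seo-faqs/"):
    print(audit_description(page))
```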

 

What’s better for a page’s description with just this SEO FAQ on it?

Both example descriptions are less than 170 characters, but one is a good description and the other is crap. If you've just Googled "Meta Descriptions," I'm betting you'll appreciate the second one more, and that's what the page is about. The first muddies the water by mixing the page's topic with the site's purpose and then advertising the credentials of the owner.

Another thing you may have noticed between 1 and 2 is capitalization. Research has shown that capitalizing the first letter of each word increases CTR. It's more eye-catching, but it's NOT ALL CAPS, so it's not in your face.

Google Search Result Using Meta Tags

And by the way… if you go over the length it's not the end of the world. There's no penalty; the extra text is simply cut off with an ellipsis…

 


Category: SEO

are we there yet?

This is likely the most annoying question I get, but I get it… we all want to know the future. Anyone who tells you a set timetable is lying or doesn't realize they don't know. We can guesstimate it, and I do mean guess-estimate: to make an accurate quote on how long it will take, we'd have to know what other people may be doing on the same keyword in the future, and often they don't even know they'll be trying to rank for that keyword a week from now.

You have to understand, with 200 factors coming together in your ranking, there are thousands of other pages with their own 200 ranking factors that will play into the answer. Sometimes unrelated things, like adding a huge blog post on another topic with poorly optimized images, can slow down the whole site, and that speed decrease may be enough to move someone else a spot higher.

In broad terms I'd say that with backlink building from relevant sites using the correct anchor text, users clicking your search result more than the one above you, and some good optimized content, you can rank at the top in less than a month for easier long tail keywords. If it's a difficult keyword with competition surrounding it, I'd expect it to take longer, and I'd be prepared for fluctuation; some days you may lose ground, so don't get upset. The thing I'd focus on is the trend: is it climbing, and can you keep up that pace? Here is a set of keywords I use as a baseline for this site… I think the trend is the most important thing.

keyword ranking trend


Tag: trend



So…two cost centers here ….

The steps you take to address the tech issues often pay for themselves. For example: I worked for a company that did about 20 million a year in sales, so not a mom-and-pop shop. B2B data analytics folks… so likely dealing in some big deals too.

They were using GoDaddy shared hosting… I think I used that for a summer in my college days to host a personal blog about my life on campus. That's NOT appropriate for an enterprise, and it shows easily in page speed.

They paid about $5 a month. I made them a DigitalOcean 1 CPU / 1 GB RAM virtual private server for $5 a month. Same cost… but 12 second load times turned into 5 seconds. We also signed up for Cloudflare's free CDN plan and saw another 2 second decrease.

So what was the cost?

Well, me. Everything cost the same as when we started, but now they could deliver a site in a quarter of the time. Sure, they paid me for my time, but I'm pretty cheap. Imagine the boost that gave their conversions, site traffic, repeat visitors and ultimately sales. They probably made up for my time in one sale.

So Technical SEO should be seen as a revenue generator, and the costs associated with it can often be covered for the price of a tank of gas. Use a Linux OS, open source software and virtual infrastructure and you've spent next to nothing.

It's just like investing in a coffee maker if you're Starbucks: without it there isn't any business.



Categories: SEO, Technical SEO
Tag: budget

Where are the best places to share articles that will help increase my site’s DA?

First… if you want to increase your DA you can, but keep in mind that's a Moz.com metric with no direct connection to Google PageRank. DR, AS, DS, CF and TF are also proprietary metrics, and DA is really only useful when you consider it alongside the others.

A primary driver in both of those metrics is referring domains on separate IPs that get traffic and are indexed on Google. So the easy answer is: every place you can put it. Now, it's important to note we aren't talking about just a link… you mentioned an article, which is the way to build context around your link and build relevance. In-content links don't come up in a menu, the header, the footer or a sidebar; they sit in the body, in the content.

To get that many domains, a guest post is probably the way to go. A guest post on a syndication network is viable; it is not a PBN. A PBN is just a collection of sites that no one visits, that don't rank for anything themselves, and that are full of links.

You're welcome to guest post on my site at the Ultimate SEO Content Syndication Network: pick the category and it gets approved for syndication. Once syndicated, your article gets posted on various sites, such as ultimateseo.org (my main site), my personal blog, or about 30 others, each with its own unique purpose and audience. If you post to an obscure category it will go to fewer sites than, say, business, simply because a dental article, for instance, wouldn't make sense on my personal blog.

But that's what you're after: as many unique, viable sites as possible that don't all share the same IP and that have some SEO metrics of their own.

Everyone wants links from sites like MarketWatch or FORTUNE, and they're great to have, but a natural link distribution means you have more mediocre sites than high-authority ones. If you have only high-authority links it looks like you bought them… which you can, but that's expensive.

Check out places to post an article that syndicate the content.

And just to make a point on content syndication vs a PBN, since some might not get the difference: The Associated Press is not a PBN, but their content is on every newspaper's website. Getting an article about you from the AP is great. Where folks get weird is in thinking about who gets credit for that article if it's printed on 100 different sites. That's fine too; Google figures out where it appeared first rather easily. A misconception about "duplicate content" is rampant in the SEO world, and Google has said many times, as far back as 10 years ago, that duplicate content "isn't a pressing concern." There's also no penalty for it; it only concerns traffic credit to the page, which you as a guest author don't need to worry about since it's not your site anyhow.


Categories: SEO, SEO Content

They are just longer phrases with a more exact searcher intent. For instance, "SEO" isn't a long tail keyword. "Ultimate SEO" isn't either, but it's twice as close. "Ultimate SEO FAQs" is just about there, and you can see it getting more and more specific… "Ultimate SEO FAQs On Long Tail Keywords"… now that's a long tail keyword.

They are searched less often, but they are much more specific, and it's easier to rank well for them with the right content.


Category: SEO

SEO can be divided into subdomains or topics in various ways. On-page vs. off-page is an easy distinction, with things like keywords being on-page SEO and backlinks being off-page SEO. Additionally, we can consider Technical SEO a distinct branch of SEO dealing with server technology, site speed and site structure.


Category: SEO

There's likely debate, and I don't claim to have tested everything out there, but I've found this mix works well for me.

WPA Auto Links – Auto-adds internal links in your content when it recognizes a phrase you've already covered elsewhere on the site.

Yoast – Hands down the best, most configurable SEO plugin for controlling what you want people to see in search results. I rarely use the keyword tool where it says to add this word here or there, and I prefer XML sitemaps from a different plugin.

XML Sitemaps And Google News – Makes the sitemap files I want, exactly the way I want them.

Schema – Yoast does schema too, but I prefer a dedicated plugin that does it really well.

CyberSyn – For RSS feeds to posts; it offers the most features of any RSS-to-post plugin I know of on WordPress.

Ultimate FAQs – I am a fan of FAQs anytime and all the time. I like that in this plugin they all appear on one page together, but a permalink also exists to each specific FAQ on its own, giving you the flexibility of serving it à la carte.

Smush – Image optimization is essential; without a plugin ensuring you're not showing a 2000 x 2000 image in a 200 x 200 spot, you'll quickly lose control of your site's load times (see the sketch after this list).

Merge Minify Refresh – You'll have 25 CSS and 20 JS files on every page, and this plugin helps tackle that without breaking everything. I am NOT a fan of Autoptimize; I usually find myself having to recover from backup after it auto-does its thing.

404 to 301 – I like to redirect 404s to a search page that is likely to help someone find what they're after, rather than just dropping them on a 404 error page or the homepage.

WP-Optimize – Always back up first, but it does a good amount of cleaning and helps keep your database tables neat. Neat means faster page load times.
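To illustrate the Smush point above, here's a minimal sketch of the same idea outside WordPress, using the Pillow library; the file name is hypothetical:

```python
from PIL import Image  # pip install Pillow

def shrink(path: str, max_px: int = 800, quality: int = 82) -> None:
    """Downscale an oversized image in place so a 2000x2000 file isn't shipped
    into a 200x200 spot. thumbnail() keeps the aspect ratio and never upscales."""
    img = Image.open(path)
    img.thumbnail((max_px, max_px))
    img.save(path, optimize=True, quality=quality)

shrink("header-photo.jpg")  # hypothetical file name
```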


Categories: SEO, Technical SEO

Great question…here is a video from them.  It is worth watching because it provides good advice for hiring an SEO and sets expectations.


Category: SEO



What is SEO based on?

SEO is a mutable, always changing field, half technology and half marketing. To my knowledge there are no independent, third party, nonprofit certifications offered in this expertise. I mention that because with $40 billion a year spent on SEO efforts, you'd think there would be an unbiased professional group, like ASTD is for training or CompTIA is for technology.

SEO is honestly the wild west. I'd say half of it is a total waste of time and money, with folks operating on pre-2010 notions of repeating their keywords over and over on a page and expecting to rank. A lot of DIY SEO efforts fall into this area.

From there, 25% is likely on-page optimization, which is not much better than the first part: correctly placing optimized text here and there, ensuring the headers are right, checking that the img alt tag has something… is the page title too long?

All while paying no regard to the fact that no one links to that content, and that it loads so slowly Google is not going to rank it in the first 4 pages.

Finally we have 25% for off-page work, of which I'd bet half is done incorrectly.

In the end, it's that last 13% or so of SEO effort (the half of the off-page work done right) that is actually relevant, executed properly and noticed by Google.

But what all of this really comes down to is attempting to reverse engineer Google. Sure, there are other search engines… yeah, right. No one is really trying to reverse engineer Bing; it's a value-added bonus next to Google, the primary focus of anyone's SEO efforts.

Officially, I should say the basis of SEO is creating the best content, content that visitors will love to link back to, served on the fastest, most secure network. But… it's still reverse engineering Google.

Sometimes I have to remind clients: that backlink we just spent a week trying to get will provide us zero traffic and, with that, not one conversion. But that's not really who the backlink is meant for… it's there for Google to see.

We want local citations and mentions because they impress Google and elevate our search ranking, which in turn drives traffic. No one should really expect traffic from the local chamber of commerce meeting minutes that link to us… but hey, that's a great local search backlink, in content and from a trusted source.

No certificate program could really be made for SEO, as the course content would be obsolete by the time the course was written, promoted, taken, and a newly certified SEO got a job.

That's what makes it a field where you have to experiment and be open to adapting the way you do things day by day.

For more reading, we have an article on the site discussing SEO, SEM, SMM and SMO and what the difference is.

And when I said it's the wild west… it really is volatile. Heck, look at this tool and ask yourself how many other industries have something like a daily scale letting them know how much upheaval there is in Google rankings for the day.

 


Category: SEO
Tags: Quora, SEO, What is

SEO Traffic

Search Engine Optimization, or SEO, is a broad term that covers actions intended to improve a web site or web page's visibility within one or more search engines. It's ever evolving as the search engines change and the way they rank is updated.

In the earliest days of the internet, keywords and keyword stuffing reigned supreme. As search engines have evolved they no longer use old signals such as meta keywords or keyword density. Keywords are still important, but it's their contextual use and placement that now matters.

SEO includes "off page" topics such as anchor text, backlinks and technical SEO. In all, some 200 factors go into consideration by Google when ranking sites for searches, and each of these factors is arguably a subsection of SEO.

SEO can further be divided into whitehat and blackhat, with many SEOs falling into a hybrid greyhat area. Blackhat SEO is considered fast and risky, while whitehat SEO meets the requirements of a search engine and seeks long term growth that may take an investment of time and money.


Category: SEO

What is the best SEO audit tool?

So that's like asking what's the best tool for building a house?

It depends.

Just like in building a house, the right tool for the job is essential. So as an SEO (BSIT, MBA, MEd) who's "top rated" by clients on Upwork and has tested in the top 10% there, here's how I see it.

No automated free audit tool is really that much better than another, because none of them can accurately audit the 200+ signals we suspect Google uses; Google, of course, does not publish them. These tools are usually just lead generating gimmicks intended to capture your contact info so someone can follow up with you. Now, they are useful… the way a heart monitor is useful. They can't diagnose the problem, but they can be an indicator of something catastrophic.

I divide audits into groupings and each has a tool set.

  1. User behavior – The most important factor. This includes bounce rate, how many pages visitors view, whether they return, and what your CTR is in Google Search Console… these are the most important signals. And it makes sense: if a page doesn't even contain a phrase, but every time Google lists it first for a specific search everyone clicks it and stays on the site for an hour, it's safe to assume the site is relevant to the user's search. Here the best tools are Google Search Console and then Google Analytics; you can combine the data in Data Studio for free (see the sketch after this list).
  2. Backlinks and mentions (what others think of the site) – A little less important than user behavior, but a heck of a lot more important than meta tags and keywords, are the off page signals. Google Search Console is a good tool here again. If you'd like a service that breaks this down further, consider that SEMrush.com, SEOProfiler.com, Moz.com and Ahrefs.com all crawl the web on their own, so they will always have a smaller sample size than Google; each is only as good as how fast it can crawl the web and start over. With that said, I go with SEMrush.com and SEOProfiler.com, as they are reasonably priced, and I always use at least two because of the smaller sample sizes. You'll notice they often disagree on how many backlinks a site has; that's fine, I just average them in my head. I also like Monitor Backlinks for just monitoring backlinks.

A couple of notes here: DA (Domain Authority) is no more accurate than Citation Flow or Trust Flow; none of them are accurate on their own. None can fully predict Google's algorithms, and focusing on any one of them is as effective as chewing gum to treat a heart attack. Taken together they become more accurate, so know all three; SEMrush offers DS, and SEOProfiler offers LIS, so there are plenty of measures that eyeball it… just don't get fixated on any one. It's like looking over your three credit reports and trying to come up with a FICO score.

  3. On-page audits – The least important part is the most thought-of part and the one that gets too much attention. Any of the tools already mentioned will audit a site for broken links and error pages, so just use any one of them. I use SEOProfiler, then SEMrush's audit, but I also like WebSite Auditor since it's available offline. On-page signals are the least accurate indicator of a page's relevancy to a search; I'd give them maybe 30% of a site's signals.
  4. Technical SEO – Often forgotten but important, especially in the world of mobile (don't forget over 50% of searches are mobile, and Google has a separate index for them). How long does the page take to load? GTmetrix.com is the tool I use for that.
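Since I keep pointing at Google Search Console for the first two groupings, here's a minimal sketch of pulling CTR and average position per query from its Search Analytics API. It assumes a service account key that has been granted access to the property; the site URL and key file name are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build  # pip install google-api-python-client

# Hypothetical key file for a service account with read access to the property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("webmasters", "v3", credentials=creds)
report = gsc.searchanalytics().query(
    siteUrl="https://example.com/",  # must match the verified property
    body={"startDate": "2019-01-01", "endDate": "2019-01-31",
          "dimensions": ["query"], "rowLimit": 10},
).execute()
for row in report.get("rows", []):
    print(row["keys"][0], f"CTR {row['ctr']:.1%}", f"avg position {row['position']:.1f}")
```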


Categories: SEO, SEO Audits

Well, DA, or Domain Authority, is a proprietary guess by Moz.com as to a domain's ability to rank for a relevant keyword. It attempts to mimic Google PageRank, which is also proprietary, so no one knows exactly what's in it. We know it has 200 ranking factors, and I've actually done an article on this on my own site, but to be quick…

It's primarily a measure of the number of domains that link to your site, how old your site is, and how much social buzz there is. Now, DA is just one measure; DR, DS, AS, CF and TF are each made by someone else, but all try to mimic Google PageRank, which is not publicly shared.

There are hundreds if not thousands of free sites that give out DA. They just have to be linked up to Moz to pull the number it assigns. There are also many free bulk DA lookup tools out there.

I personally use Domain Detailer by Domain Hunter Gatherer (I don't work with or for them, I honestly just like their product). It's great because it gives you DA in a useful way. DA by itself is worthless; it only has value if you believe in it, because it's just their guess.

Domain Detailer also gives you Moz backlinks, Majestic backlink counts, Majestic domain counts, Majestic IP address counts, CF, TF, MozRank, PageRank and more. It does it all automatically, in bulk, and you buy credits ahead of time. I think I paid about 30 bucks and can use it 10,000 times or something.

Learn more about these SEO Metrics in these two articles:

SEO Metrics: DA DS DR DT CF TF DP AS and domain authority ranking.

Is A Site Low Level? DA vs CF vs TF … DR, DS, AS, TS and more!


Categories: SEO, SEO Audits


Auto blogging is the practice of increasing content across sites, usually with the same content but sometimes with subtle changes in the wording. It's usually done to pad content, and it's similar to syndicated news or guest posts that may appear on other sites.

What's the issue with it? Well, it can't be the most unique, fresh content if you're auto blogging it. Even with the use of synonyms and word spinners, the content is below grade and can count as duplicate content. You should ensure the content isn't plagiarized and give credit to the author. Canonical tags come into play here: you get the use of the content, but you pass the SEO value and credit back to the author.

What's wrong with spinner content? It's not natural. Consider the sentence "It's the day after Valentine's Day here and I'm just hanging out with my dog." I got back "Its the after a long time after Valentine's Day here and I'm simply spending time with my puppy." from Free Article Spinner.com. There's now a mishap at the beginning of the sentence, and "I'm simply spending time with my puppy"… how would it not be simply? Or how about another test sentence…

I prefer doing technical SEO audits to local SEO work just because Dallas, Houston and Los Angeles are all very different from Louisville or Chicago.

I incline toward doing specialized SEO reviews to nearby SEO work since Dallas, Houston and Los Angeles are for the most part altogether different from Louisville or Chicago.

Get the point on auto blogging? I'd say you're better off rewriting it yourself, and then it's not auto blogging.


Categories: SEO, SEO Content

James Cameron does what James Cameron does because James Cameron is James Cameron.

That is keyword stuffing. It's easy to spot and pretty obvious. Think of it as a shameless self-promotional tactic that forces a keyword needlessly into a page's content. It distracts visitors at best, and at worst it hurts your SEO value.


Category: SEO

In the early 2000s. PageRank isn't a publicly available metric anymore, according to Google. You can monitor your page and domain using other metrics such as Domain Authority, Citation Flow and Trust Flow. These are third party estimates of how they expect Google to see your site's ability to rank keywords; they are not Google's own metrics. There are additional competing metrics from other sites as well, and you should be aware of them all. Test them out and decide for yourself which numbers work best for you.

Ultimate SEO uses Domain Authority and Trust Flow, along with Spam Score, to help build a baseline depiction of a page's ability to rank a keyword.


Category: SEO

Keywords first and foremost should be located where they make sense.  Don’t force them into a place just because you want that keyword.

With that said, keywords should be in the page title, description, img alt tags, headers, and a few times in the content. Since each page needs to be unique, I recommend focusing on one main keyword per page or post. That way all of the content on the page is relevant to that specific keyword.
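A quick way to sanity-check that placement is to parse the page and look for the keyword in each recommended spot. A minimal sketch with BeautifulSoup; the file and keyword in the usage line are hypothetical:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def keyword_placement(html: str, keyword: str) -> dict:
    """Report which of the recommended spots actually contain the focus keyword."""
    soup = BeautifulSoup(html, "html.parser")
    kw = keyword.lower()
    return {
        "title": bool(soup.title) and kw in soup.title.get_text().lower(),
        "meta description": any(kw in (m.get("content") or "").lower()
                                for m in soup.find_all("meta", attrs={"name": "description"})),
        "headers": any(kw in h.get_text().lower() for h in soup.find_all(["h1", "h2", "h3"])),
        "img alt": any(kw in (img.get("alt") or "").lower() for img in soup.find_all("img")),
        "body": kw in soup.get_text().lower(),
    }

# Hypothetical usage against a saved page:
# print(keyword_placement(open("page.html").read(), "long tail keywords"))
```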


Category: SEO

SEO Audits (10)

The best plan is to hire me to do a technical audit and let me implement my recommendations. Too easy, right? Okay… so first you need to establish a baseline: go to gtmetrix.com and put in a targeted page you'd like to optimize.

gtmetrix technical seo tool

Okay… so I'm next to perfect here, but it wasn't always that way. I used to have a slider with 5 large images at the top of my homepage. That slider required a dozen or so JavaScript and CSS files to load, and since it was the first thing you saw, you had to wait for it before the page began to function. After that the page had images and more JavaScript and more CSS files. Each file on a page is a request: your computer has to ask the server for it, the server may take a moment to respond with the file, then the browser has to put it in place on the page. Multiply that process by a hundred and you see a chance to improve your site speed.

Additionally, combining files cuts down on the time required. I have 8 or so certification icons on the homepage, but I merged them into 2 images instead of 8. Here is an example…

technical certifications

We can also improve technical load times by getting a server dedicated to the site. You might try a micro server on AWS or Google Cloud Platform; they cost hardly anything, but the resources are available for your site when you need them. Cloudflare also offers speed advantages through its cache and CDN network. There's more, but these items should be enough to start you on the road to at most a 3 second load time. That's honestly where you have to draw the line if you want to rank a keyword against competitors who are willing to do what it takes to load within 3 seconds.
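To see how many requests a page racks up before you even count fonts and AJAX calls, you can count the script, stylesheet and image tags in its HTML. A rough sketch assuming the requests and BeautifulSoup libraries, with a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

url = "https://example.com/"  # hypothetical page to baseline
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

scripts = [s["src"] for s in soup.find_all("script", src=True)]
styles = [l["href"] for l in soup.find_all("link", rel="stylesheet")]
images = [i["src"] for i in soup.find_all("img", src=True)]

total = len(scripts) + len(styles) + len(images)
print(f"{len(scripts)} JS + {len(styles)} CSS + {len(images)} images "
      f"= {total} requests before fonts and AJAX calls")
```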



This is a solid, reasonable question… that unfortunately requires a sliding scale answer. Since your Google ranking is made up of 200 factors, and Domain Pop is just one of them, no exact number can be used to determine your rank. First: what's the quality of the referring domains? 100 no-name domains that you just created aren't worth one link from huffingtonpost.com.

I would suggest a rough rule of thumb:

DA 10 = fewer than 50 referring domains

DA 20 = 200

DA 30 = 400

DA 40 = 800

DA 50 = 2000

These are not exact numbers, just rules of thumb I've found helpful. Basically, for every ten more points of DA the number of referring domains roughly doubles. PBN sites and domains with little to no traffic are maybe only worth a fourth of a real domain, so if you have 800 PBN sites linking to you I'd say you may expect a DA of about 10 to 20. If those 800 are solid, real domains, your DA should be 35 to 40.
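If you want to play with that rule of thumb, here's a small sketch that encodes it, including the discount for PBN-style domains. The thresholds are just the figures above, my own estimates, not anything Moz publishes:

```python
def estimated_da(referring_domains: int, pbn_share: float = 0.0) -> int:
    """Rule-of-thumb sketch: each +10 DA roughly doubles the referring domains
    needed, and PBN/no-traffic domains count for about a fourth each."""
    effective = referring_domains * (1 - pbn_share) + referring_domains * pbn_share / 4
    for needed, da in ((2000, 50), (800, 40), (400, 30), (200, 20), (50, 10)):
        if effective >= needed:
            return da
    return 5  # under ~50 real domains, expect a single-digit DA

print(estimated_da(800))                 # ~40 with solid, real domains
print(estimated_da(800, pbn_share=1.0))  # ~20 when every link is a PBN site
```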

 

 


Quick Answer: Yes

Long Answer: Yes. They don't do the same thing… Google wouldn't have made the same tool twice.

Google Analytics tracks your visitors and their experience and behavior on your site, regardless of where they came from or how they got there. That means Bing traffic is counted, and direct traffic, where someone types the address into the address bar, is counted as well.

Google Search Console tracks the queries searchers made and how you ranked for them, and includes data on your query performance even when you weren't selected from the results.

  • Google Analytics tracks visitors to the site no matter where they came from.
  • Google Search Console tracks the queries you were displayed for and how often Google searchers selected you as the result they wanted for that query.

So: no Bing traffic in Google Search Console, and no keyword or ranking position data in Google Analytics for searchers who never reached your site.

Two products, two different audiences, two different uses… different data.

For fun, and for transparency, I often use Google Data Studio to pull those two products together for y'all.

You can tie the two together if you'd like the data shared. What follows is Google's documentation on associating them:


Access Search Console data in Google Analytics

This feature is currently supported only in the old Search Console. Open old Search Console now.

If you associate a Google Analytics property with a site in your Search Console account, you’ll be able to see Search Console data in your Google Analytics reports. You’ll also be able to access Google Analytics reports directly from the Links to your site, and Sitelinks pages in Search Console. Note that you can only associate a website; you cannot associate an app.

Associate Properties

You must be an owner of the Google Analytics property to be able to associate it with a website in Search Console.

You can open the Google Analytics association page from the property settings dropdown in Search Console.

When you associate a site in your Search Console account with a Google Analytics property, by default Search Console data is enabled for all profiles associated with that property. As a result, anybody with access to that Google Analytics property may be able to see Search Console data for that site. For example, if a Google Analytics admin adds a user to a profile, that user may be able to see Search Console data in Search Optimization reports.

A site can be associated with only one property, and vice versa. Creating a new association removes the previously existing association.

Every Google Analytics property can have a number of views. When you associate a site with a property, clicking a link to Google Analytics from Search Console will take you to that property’s default view. (If you previously associated your site with a different view, clicking a link will now take you to the default view instead. If you want to see a different view, you can switch views from within Google Analytics.)

If your site is already associated with a Google Analytics property, it could be for a couple of reasons. Either you already used Google Analytics to associate this property with the site, or another site owner has made the association.

If your site is associated with an Analytics property you don’t recognize (and don’t want), it may be because another site owner associated the site with an Analytics property you don’t own. In this case, you can delete the association and create a new one.

If your site used to be associated with a property, but no longer is, it may be that the property was later associated with a different site. (Remember, a site can be associated with only one property. Creating a new association will remove the previously existing association.)

You can also create an association using the Analytics admin page if you're an account administrator for the Google Analytics property. Google Analytics account administrators can move their Analytics property from one Analytics account to another. Any associated Search Console properties will maintain their association as part of the move. After the move, any users of the new Analytics account will be able to see data from the associated Search Console property without a notification in Search Console. Learn more.

Removing Search Console data from Google Analytics

To remove Search Console data from a Google Analytics property, unlink the association using Search Console's association page, or manage the association in the Analytics admin page (if you're an administrator for the Google Analytics property).

Why doesn’t Search Console data match Google Analytics data?

Search Console data may differ from the data displayed in other tools, such as Google Analytics. Possible reasons for this include:

  • Search Console does some additional data processing—for example, to handle duplicate content and visits from robots—that may cause your stats to differ from stats listed in other sources.
  • Some tools, such as Google Analytics, track traffic only from users who have enabled JavaScript in their browser.
  • Google Analytics tracks visits only to pages which include the correctly configured Analytics Javascript code. If pages on the site don’t have the code, Analytics will not track visits to those pages. Visits to pages without the Analytics tracking code will, however, be tracked in Search Console if users reach them via search results or if Google crawls or otherwise discovers them.
  • Some tools define “keywords” differently. For example:
    • The Keywords page in Search Console displays the most significant words Google found on your site.
    • The Keywords tool in Google Adwords displays the total number of user queries for that keyword across the web.
    • Analytics uses the term “keywords” to describe both search engine queries and Google Ads paid keywords.
    • The Search Console Search Analytics page shows the total number of keyword search queries in which your page's listing was seen in search results, and this is a smaller number. Also, Search Console rounds search query data to one or two significant digits.



First off… honestly, take pretty much everything you know about duplicate content and forget it. You will have duplicate content, and that's alright… it's normal. The Associated Press is a highly syndicated news source; do you think Google penalizes every news site that includes AP content?

Think I'm talking crazy? Look for information from Google on duplicate content and ignore info from anyone else. To save you a step, here's me duplicating content… copying from Google: https://support.google.com/webmasters/answer/66359?hl=en

 

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin. Examples of non-malicious duplicate content could include:

  • Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
  • Store items shown or linked via multiple distinct URLs
  • Printer-only versions of web pages

If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google. (This is called “canonicalization”.) More information about canonicalization.

However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.

Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a “regular” and “printer” version of each article, and neither of these is blocked with a noindex meta tag, we’ll choose one of them to list. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

There are some steps you can take to proactively address duplicate content issues, and ensure that visitors see the content you want them to.

  • Use 301s: If you've restructured your site, use 301 redirects ("RedirectPermanent") in your .htaccess file to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)
  • Be consistent: Try to keep your internal linking consistent. For example, don't link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.
  • Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
  • Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.


Category: SEO Audits


SEO Backlinks (17)

301 Redirect

301 redirects are used when content that was located at one web address, or URL, is moved to another web address.  They ensure traffic using the old address is still served.  Consider:

A product page for rat poison, including what to do if it's accidentally ingested, is available on the company's website.  Poison Control googles that poison, finds the page with the instructions and bookmarks it for future reference.  The company is bought out by a larger company that rebrands the poison, and when it's rebranded the original site is turned off.  Without a 301 redirect the Poison Control bookmark would just go to an error; with one, the old link is redirected to the new site's content.

302 Redirect

A 302 redirect is the same thing as a 301 redirect but temporary. When a search engine crawls a page and sees a 301 redirect, it knows to change the listing in its results to the new target.  A 302 redirect tells it not to make that change because the move is only temporary.  302s are used less often than 301 redirects.
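To make the difference concrete, here's a minimal sketch using only Python's standard library. The paths and target URLs are invented for the rat-poison example above, not taken from any real site.

```python
# A tiny server that answers the old product URL with a 301 (permanent)
# and a promo URL with a 302 (temporary).
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/rat-poison-instructions":
            # 301: search engines update their listing to the new target
            self.send_response(301)
            self.send_header("Location", "https://new-brand.example.com/poison-instructions")
        elif self.path == "/summer-sale":
            # 302: search engines keep the old URL in their listing
            self.send_response(302)
            self.send_header("Location", "https://example.com/current-promo")
        else:
            self.send_response(404)
        self.end_headers()

HTTPServer(("", 8080), RedirectHandler).serve_forever()
```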

 

Hits: 26

Category: SEO Backlinks


Answer: Does a bear shit in the woods?

PBNs are extremely effective, and they remain relevant, useful and in use in 2019.

do pbns work still?

 PBNs in 2019

They may look different and work slightly differently from when an SEO ran a 10-site network in 2005 … but they work.  If they didn't, Google wouldn't be trying to destroy them.

Hits: 22

Category: SEO Backlinks
Tag: PBN SEO


are we there yet?

This is likely the most annoying question I get, but I get it … we all want to know the future.  Anyone who gives you a set timetable is lying or doesn't realize they don't know.  We can guesstimate … and I mean guess-estimate.  For us to quote an accurate timeline we'd have to know what everyone else will be doing on the same keyword in the future, and often they don't even know they'll be ranking for that keyword a week from now.

You have to understand that with 200 factors coming together in your ranking, there are thousands of competing pages, each with their own 200 ranking factors, that play into the answer.  Sometimes unrelated things, like adding a huge blog post on another topic with poorly optimized images, can slow down the whole site, and that speed decrease may be enough to move someone else a spot higher.

In broad terms I'd say that with backlink building from relevant sites using the correct anchor text, users clicking your search result more than the one above you, and some good, optimized content, you can rank at the top in less than a month for easier long-tail keywords.  If it's a difficult keyword with competition surrounding it, I'd expect it to take longer, and I'd be prepared for fluctuation: some days you may lose ground; don't get upset.  The thing I'd focus on is the trend … is it climbing, and can you keep up that pace?  Here is a set of keywords I use as a baseline for this site … I think the trend is the most important thing.

keyword ranking trend
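If you check positions daily, one simple way to put a number on "the trend" is a least-squares slope over your rank history. A minimal sketch with made-up sample data (remember position 1 is best, so a negative slope means you're climbing):

```python
# Fit a straight line to daily rank checks and report the slope.
rankings = [28, 27, 27, 25, 24, 22, 21]  # one rank check per day (sample data)

n = len(rankings)
xs = range(n)
mean_x = (n - 1) / 2
mean_y = sum(rankings) / n
num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, rankings))
den = sum((x - mean_x) ** 2 for x in xs)
slope = num / den
print(f"trend: {slope:+.2f} positions/day")  # about -1.21 here: a steady climb
```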

Hits: 22

Tag: trend

This is a solid, reasonable question … that unfortunately requires a sliding-scale answer.  Since your Google ranking is made up of 200 factors and Domain Pop is just one of them, no exact number can determine your rank.  First … what's the quality of the linking domains?  100 no-name domains that you just created aren't worth one link from huffingtonpost.com.

I would suggest a rough rule of thumb, in referring domains per DA level …

DA 10 = less than 50 referring domains

DA 20 = 200

DA 30 = 400

DA 40 = 800

DA 50 = 2000

 

These are not exact numbers, just rules of thumb I've found helpful.  Roughly, each step up the scale doubles the number of referring domains needed.  PBN sites and domains with little to no traffic are maybe only worth a fourth of a real domain … so if you have 800 PBN sites linking to you, I'd say you may expect a DA of about 10 to 20.  If those 800 are solid, real domains, your DA should be 35–40.
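As a rough illustration, and only that, the rule of thumb can be turned into a small estimator. The thresholds come straight from the table above plus the quarter-weighting of PBN links; this is my heuristic, not Moz's formula.

```python
def estimate_da(real_domains: int, pbn_domains: int = 0) -> int:
    """Rule-of-thumb DA estimate from referring domain counts."""
    effective = real_domains + pbn_domains / 4   # PBN links count ~1/4
    da = 10 if effective else 0                  # under ~50 domains: DA ~10
    for needed, score in [(200, 20), (400, 30), (800, 40), (2000, 50)]:
        if effective >= needed:
            da = score
    return da

print(estimate_da(800))       # 800 solid domains -> 40
print(estimate_da(0, 800))    # 800 PBN links ~ 200 effective -> 20
```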

 

 

Hits: 32

Quick Answer: Yes

Long Answer: Yes, but they don't do the same thing … Google wouldn't have built the same tool twice.

Google Analytics tracks your visitors and their experience and behavior on your site regardless of where they came from or how they got there. That means Bing traffic is counted, and direct traffic, where someone types the address into the address bar, is counted as well.

Google Search Console tracks the queries searchers made and how you ranked for them, and includes data on your query performance even when you weren't the result selected.

  • Google Analytics tracks visitors to the site no matter where they came from.
  • Google Search Console – tracks the queries your site was displayed for and how often searchers selected you as the result they wanted for that query.

So: no Bing traffic in Google Search Console, and no keyword or position data in Google Analytics for searchers who never visited your site.

Two products, two different audiences, two different uses …. different data.

For fun and for transparency I often use Google Data Studio to pull those two products together for y'all.

You can tie the two together if you’d like data shared…


Access Search Console data in Google Analytics

This feature is currently supported only in the old Search Console.

If you associate a Google Analytics property with a site in your Search Console account, you’ll be able to see Search Console data in your Google Analytics reports. You’ll also be able to access Google Analytics reports directly from the Links to your site and Sitelinks pages in Search Console. Note that you can only associate a website; you cannot associate an app.

Associate Properties

You must be an owner of the Google Analytics property to be able to associate it with a website in Search Console.

You can open the Google Analytics association page from the property settings dropdown in Search Console.

When you associate a site in your Search Console account with a Google Analytics property, by default Search Console data is enabled for all profiles associated with that property. As a result, anybody with access to that Google Analytics property may be able to see Search Console data for that site. For example, if a Google Analytics admin adds a user to a profile, that user may be able to see Search Console data in Search Optimization reports.

A site can be associated with only one property, and vice versa. Creating a new association removes the previously existing association.

Every Google Analytics property can have a number of views. When you associate a site with a property, clicking a link to Google Analytics from Search Console will take you to that property’s default view. (If you previously associated your site with a different view, clicking a link will now take you to the default view instead. If you want to see a different view, you can switch views from within Google Analytics.)

If your site is already associated with a Google Analytics property, it could be for a couple of reasons. Either you already used Google Analytics to associate this property with the site, or another site owner has made the association.

If your site is associated with an Analytics property you don’t recognize (and don’t want), it may be because another site owner associated the site with an Analytics property you don’t own. In this case, you can delete the association and create a new one.

If your site used to be associated with a property, but no longer is, it may be that the property was later associated with a different site. (Remember, a site can be associated with only one property. Creating a new association will remove the previously existing association.)

You can also create an association using the Analytics admin page if you’re an account administrator for the Google Analytics property. Google Analytics account administrators can move their Analytics property from one Analytics account to another. Any associated Search Console properties will maintain their association as part of the move. After the move, any users of the new Analytics account will be able to see data from the associated Search Console property without a notification in Search Console.

Removing Search Console data from Google Analytics

To remove Search Console data from a Google Analytics property, unlink the association using Search Console’s association page, or manage the association from the Analytics admin page (if you’re an administrator for the Google Analytics property).

Why doesn’t Search Console data match Google Analytics data?

Search Console data may differ from the data displayed in other tools, such as Google Analytics. Possible reasons for this include:

  • Search Console does some additional data processing—for example, to handle duplicate content and visits from robots—that may cause your stats to differ from stats listed in other sources.
  • Some tools, such as Google Analytics, track traffic only from users who have enabled JavaScript in their browser.
  • Google Analytics tracks visits only to pages which include the correctly configured Analytics Javascript code. If pages on the site don’t have the code, Analytics will not track visits to those pages. Visits to pages without the Analytics tracking code will, however, be tracked in Search Console if users reach them via search results or if Google crawls or otherwise discovers them.
  • Some tools define “keywords” differently. For example:
    • The Keywords page in Search Console displays the most significant words Google found on your site.
    • The Keywords tool in Google AdWords displays the total number of user queries for that keyword across the web.
    • Analytics uses the term “keywords” to describe both search engine queries and Google Ads paid keywords.
    • The Search Console Search Analytics page shows the total number of keyword search queries in which your page’s listing was seen in search results, which is a smaller number. Also, Search Console rounds search query data to one or two significant digits.

Hits: 25

Backlinks are links from other sites.  Think of them as votes of affirmation.  Only one vote can come from each domain, so for SEO purposes it doesn't matter whether there are 100 links or 1 link from the same domain: you gain one vote.  The extra links may increase traffic to your site, but in regards to SEO it's one vote.

The more domains that link to you, the more authoritative you must be, right?  Well, kinda.  If 1,000 domains link to your site, you're likely more authoritative than a site that 3 sites link to.  But not all domains, or votes, or backlinks … are the same.  A link to your site from UltimateSEO.org carries the weight attributed to that site by its own backlinks.  Different companies refer to the authority of a site differently.
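Here's a minimal sketch of that "one vote per domain" counting, with a made-up backlink list: what matters is unique referring domains, not raw link totals.

```python
# Count unique referring domains from a list of backlink URLs.
from urllib.parse import urlparse

backlinks = [
    "https://ultimateseo.org/some-post",
    "https://ultimateseo.org/another-post",   # same domain: still one vote
    "https://example-blog.com/review",
]

referring_domains = {urlparse(url).netloc.removeprefix("www.") for url in backlinks}
print(len(referring_domains))  # 2 votes, even though there are 3 links
```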

 

You can see a site's backlinks in many indexes, most of them paid.  Ultimate SEO recommends Monitor Backlinks if you want a tool that is really good at backlinks.  UltimateSEO received nothing for that endorsement.  The endorsement, or vote … as you see, is a backlink.

Hits: 26

Category: SEO Backlinks


Negative SEO is a branch of SEO that largely came into existence with Google's Penguin update.  For the first time, a search engine was prepared to lower your ranking based on what it perceived as a site gaming the system.  Not only did sites change the way they promoted themselves, but this created a very real opportunity to harm competitors' sites.

All you have to do is mimic spammy promotion of a site to draw Google into penalizing it.  This can include:

  • Building too many unnatural backlinks from low-quality sites too quickly to have been organic
  • Using anchor text with adult themes or words associated with low-brow tactics
  • Associating the target site with link building schemes, such as a PBN
  • Duplicating their content, thereby watering down the value of the original

Google On Negative SEO

Hits: 1473

Google provides a tool called the Disavow Tool.  It allows site owners to list backlinks they don't want counted as part of their site's backlink profile.  Some backlinks, considered "spammy," may harm your SEO efforts.  But Google has said it is now pretty good at recognizing a poor-quality backlink and ignoring it, so you may not need to list these links at all.

You should only use the Disavow Tool if you know your site is being penalized, which you can find out from Google Search Console.  Disavowing preemptively may hurt your SEO if you disavow links Google thought were good.
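If you do end up needing it, the disavow file itself is just plain text: one URL per line, a `domain:` prefix to disavow an entire domain, and `#` for comments. A small sketch that writes one (the spammy domains and URL here are placeholders):

```python
# Write a disavow.txt in the format Google's Disavow Tool accepts.
spammy_domains = ["spam-directory.example", "link-farm.example"]
spammy_urls = ["http://shadyblog.example/post-123"]

with open("disavow.txt", "w") as f:
    f.write("# Unnatural links identified via Search Console\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")   # disavow every link from this domain
    for url in spammy_urls:
        f.write(f"{url}\n")             # disavow a single page's link
```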

Hits: 25

Category: SEO Backlinks

Backlink indexers are sites, such as indexkings.com, that claim to get your backlinks found by Google faster.  The jury is out on how useful they actually are, but essentially they place backlinks to your backlinks.  The theory: if you have one page with a link you want Google to find, you'll do better with 800 links pointing at that backlink page.  It sounds reasonable, but is 800 versus 1 any different when we're talking about zillions of links on the web?

I've used both free and paid indexers and ultimately saw no marked difference in how quickly Google indexed the page my backlink sat on.  With that said, please share supporting evidence if you disagree.  BTW … indexkings.com, linked above, is a free site, so maybe you can run a study for us that only costs you time.

Final thought: if you receive a good backlink, it's likely on a popular site and Google will find it without issue.  If you paid for a link on some obscure site, then sure, indexing may be worth a try.

Hits: 20

Category: SEO Backlinks
Tag: Indexer


SEO Content (4)

I have to admit RSS feeds are useful, in my opinion.  Say you have 10 blogs that you do SEO for … you can pair each blog's RSS feed with an IFTTT applet to auto-generate your social media announcements of new content, rather than posting manually to Facebook, Twitter, LinkedIn, Blogger, Tumblr, Instagram and so on.

You can also use feeds to ensure your blog gets regular updates, which improves the likelihood of crawlers visiting your site more frequently.  It's not main-attraction content you'd be pulling in, but filler content that is still relevant.  Think of it like this … when you go to your local newspaper's site, likely all of the national news is from the Associated Press.  That means those articles are duplicate content, but duplicate content isn't as bad as folks make it out to be … it happens, and in journalism it's more the norm than the exception.

Sometimes I feel that RSS content aggregated well can create unique content.  For example, during an election I built a site that aimed to have all the local election news in one spot.  VoteLouisville.com pulled the RSS feeds of everyone running for office, creating a place where you could read two candidates' updates side by side and compare what they were focusing on.  The candidates want their content reposted, and voters likely want to see both sides, so in that scenario I think the RSS content made uniquely different aggregated content.
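A minimal sketch of that aggregation idea, using the third-party feedparser package (`pip install feedparser`); the candidate names and feed URLs are placeholders, not the real VoteLouisville.com sources:

```python
# Pull several RSS feeds and print their latest posts in one stream.
import feedparser

feeds = {
    "Candidate A": "https://candidate-a.example/feed",
    "Candidate B": "https://candidate-b.example/feed",
}

for name, url in feeds.items():
    for entry in feedparser.parse(url).entries[:5]:   # latest 5 per feed
        print(f"[{name}] {entry.title} -> {entry.link}")
```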

Hits: 191

Where are the best places to share articles that will help increase my site’s DA?

First … if you want to increase your DA you can, but keep in mind that's a Moz.com metric with no direct connection to Google PageRank. DR, AS, DS, CF and TF are also proprietary metrics, and DA is really only useful when considered alongside them.

A primary decider in those metrics is referring domains on separate IPs that get traffic and are indexed by Google. Easy answer … every place you can put an article. Now, it's important to note we aren't talking about just a link … you mentioned an article, which is the way to build context around your link and build relevance. In-content links don't sit in a menu, the header, the footer or the sidebar; they're in the body, in the content.

To get many domains … a guest post is probably the way to go. A guest post on a syndication network is viable … it is not a PBN. A PBN is just a collection of sites that no one visits, that don't rank for anything themselves and are full of links.

You're welcome to guest post on my site at the Ultimate SEO Content Syndication Network: pick the category and it gets approved for syndication. Once syndicated, your article gets posted on various sites, such as ultimateseo.org (my main site), my personal blog, or about 30 others, each with its own unique purpose and audience. If you post to an obscure category it will go to fewer sites than, say, business, because a dental article wouldn't make sense on my personal blog, for instance.

But that's what you're after: as many unique, viable sites as possible that don't all share the same IP and that have some SEO metrics of their own.

Everyone wants links from sites like MarketWatch or FORTUNE, and they're great to have, but a natural link distribution means more mediocre sites than high ones … if you have only high ones it looks like you bought them … which you can, but that's expensive.

Check out places to post an article that syndicate the content.

And just to make a point on content syndication vs. a PBN, because some might not get the difference … The Associated Press is not a PBN, but its content is on every newspaper's website. Getting an article about you from the AP is great. Where folks get weird is in thinking about who gets credit for that article if it's printed on 100 different sites. That's fine too: Google figures out where it appeared first rather easily. A misconception about "duplicate content" is rampant in the SEO world, and Google has said many times, as far back as 10 years ago, that duplicate content "isn't a pressing concern." There's also no penalty; the question only concerns traffic credit to the page, which you as a guest author don't need to worry about since it's not your site anyhow.

Hits: 3

Categories: SEO, SEO Content

Yes … already, come on!  An interesting thing once happened on a blog post I wrote: while the article was on page two, the image I used in the post, which included a good img alt tag, ended up ranking #1 for that keyword in image search.  Image search is just less competitive, and the point is that on page one a few images are usually shown, so with the right image you can pull in more clicks than your page-two article.

It's also a requirement.  Alt tags are needed for screen readers to read a page to a blind person; they're part of your site's accessibility to people with ADA needs.  In the end a web crawler is like a blind person, so use the tag to show you've added yet another supporting element on the page for the keyword you're after … and often others don't use those tags, so it's your secret weapon sometimes.

Citation Flow – I used this image to convey Citation Flow's uselessness, and the img alt tag supports that.
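Since missing alt text is so common, a quick audit is easy to script. A minimal sketch using only the standard library (the sample HTML is invented):

```python
# Flag <img> tags that have no alt text.
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                print("missing alt:", attrs.get("src", "<no src>"))

AltAudit().feed(
    '<img src="citation-flow.png">'
    '<img src="ok.png" alt="Citation Flow chart">'
)  # prints: missing alt: citation-flow.png
```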

Hits: 21

Category: SEO Content

Auto blogging is the practice of multiplying content across sites, usually with the same content but sometimes with subtle changes in wording.  It's usually done to pad content, and it's similar to syndicating news or guest posts that may appear on other sites.

What's the issue with it?  Well, it can't be the most unique, fresh content if you're auto blogging it.  Even with the use of synonyms and word spinners the content is below grade and can count as duplicate content.  You should ensure the content isn't plagiarized and give credit to the author.  Canonical tags play into this: you get the use of the content, but the SEO value and credit point back to the author.

What's wrong with spinner content?  It's not natural … consider the sentence "It's the day after Valentine's Day here and I'm just hanging out with my dog."  I got back "Its the after a long time after Valentine's Day here and I'm simply spending time with my puppy." from Free Article Spinner.com.  There's a mishap at the beginning of the sentence now, and as for "I'm simply spending time with my puppy" … how would it not be simply?  Or how about another test sentence …

I prefer doing technical SEO audits to local SEO work just because Dallas, Houston and Los Angeles are all very different from Louisville or Chicago.

I incline toward doing specialized SEO reviews to nearby SEO work since Dallas, Houston and Los Angeles are for the most part altogether different from Louisville or Chicago.

Get the point on auto blogging?  I'd say you're better off rewriting it yourself … and then it's not auto blogging.

Hits: 19

Categories: SEO, SEO Content

SEO Metrics (3)

Moz.com created DA, which stands for Domain Authority.  It was heavily tied to the number of domains that linked to your site, but in 2019 DA 2.0 came out, adding a preference for linking domains that receive verifiable traffic.

So it used to be fine to have 3 sites linking to you that got 0 monthly users, but now those sites are worth less than a site with traffic.

DA is a popular metric used widely in SEO, but on its own it's unreliable and does not reflect Google's assessment.

Comparable metrics that may be of value when reviewing a site's standing are Domain Rank (DR), Domain Score (DS), Domain Pop (DP), Citation Flow (CF) and more!

Hits: 14

Category: SEO Metrics
Tag: DA

Domain Rank, or DR, is an SEO metric meant to depict a site's ability to rank. DR comes from Ahrefs (which calls it Domain Rating); Majestic's counterparts are Citation Flow and Trust Flow.

It is similar to Domain Score (DS) and Domain Authority (DA).

Hits: 16

Category: SEO Metrics

Domain Pop, or DP, is the number of unique domains linking to a site.  It's similar to IP Pop, which is the number of unique IPs linking to a site.

Hits: 40

Category: SEO Metrics
Tags: DP, IP

Technical SEO (8)

The best plan is to hire me to do a technical audit and let me implement my recommendations.  Too easy, right?  Okay … then you need to establish a baseline: go to gtmetrix.com and put in a targeted page you'd like to optimize.

gtmetrix technical seo tool

Okay … so I'm next to perfect here, but it wasn't always that way.  I used to have a slider with 5 large images at the top of my homepage.  That slider required a dozen or so JavaScript and CSS files to load, and since it was the first thing you saw, you had to wait for it before the page began to function.  After that the page had images and more JavaScript and CSS files.  Each file on a page is a request … your computer has to ask the server for it, the server may take a moment to respond with the file, then the browser has to put it in place on the page.  Multiply that process by a hundred and you see a chance to improve your site speed.

Additionally, combining files cuts down on the time required.  I have 8 or so certification icons on the homepage, but I merged them into 2 images instead of 8.  Here is an example …

technical certifications
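If you want to script that kind of merge, here's a minimal sketch using the Pillow imaging library (`pip install Pillow`); the icon filenames are placeholders. Merging eight icons into one sprite turns eight requests into one.

```python
# Stitch several small icons into a single horizontal sprite image.
from PIL import Image

icons = [Image.open(f"cert-{i}.png") for i in range(1, 9)]  # placeholder files
width = sum(im.width for im in icons)
height = max(im.height for im in icons)

sprite = Image.new("RGBA", (width, height))
x = 0
for im in icons:
    sprite.paste(im, (x, 0))
    x += im.width

sprite.save("certifications-sprite.png")  # one request instead of eight
```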

We can also improve load times by getting a server dedicated to the site.  You might try a micro server on AWS or Google Cloud Platform; they cost hardly anything, and the resources are available for your site when you need them.  Cloudflare also offers speed advantages through its cache and CDN network.  There's more, but these items should be enough to start you on the road to at most a 3-second load time.  That's honestly where you have to draw the line if you want to rank a keyword against competitors who are willing to do what it takes to load within 3 seconds.

Hits: 21

page speed 3 second rule

All of your site's pages should load within three seconds.  After five seconds, the number of people still waiting for your page to load is about half of those who started.  As your page takes longer to load, Google lowers your rank so it can serve visitors results they'll actually use.

A lot of people ignore this element of technical SEO, and a lot of people aren't going to rank for the keywords they want.  Optimize your page content and server hardware to ensure pages are visible within three seconds.

Test the top results for the keyword you're hoping to rank for, and I can pretty much guarantee they load in under 3 seconds.  The top result for "Barack Obama" is his Wikipedia page, which loads in 2.8 seconds according to GTmetrix.  The top result for "Ultimate SEO" is SEO Ultimate Plus, which loads in 3.9 seconds.  Our site loads in 3.4 seconds.

Page speed is just one of the 200 factors that go into Google ranking, but it's one of the easiest to change.
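For a rough self-check from the command line, you can at least time the raw download with the standard library. Note this measures server response and transfer only, not the full browser render that GTmetrix reports, so treat it as a floor, not the 3-second number itself.

```python
# Time how long a page takes to fetch (response + transfer).
import time
import urllib.request

start = time.perf_counter()
with urllib.request.urlopen("https://example.com") as resp:
    resp.read()
elapsed = time.perf_counter() - start
print(f"fetched in {elapsed:.2f}s (the fully rendered page must stay under ~3s)")
```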

 

Hits: 28

Category: Technical SEO


So … two cost centers here …

The steps you take to address the tech issues … often they should pay for themselves.  For example, I worked for a company that did around 20 million a year in sales, so not a mom and pop shop but a real company.  B2B data analytics kind of folks … so also likely dealing in some big deals.

They were using GoDaddy shared hosting … I think I used that for a summer in my college days to host my personal blog about life on campus.  That's NOT appropriate for an enterprise, and it shows easily in page speed.

They paid about $5 a month.  I made them a DigitalOcean virtual private server with 1 CPU and 1 GB of RAM for $5 a month.  Same cost … but 12-second load times turned into 5 seconds.  We also signed up for Cloudflare's free CDN plan and saw another 2-second decrease.

So what was the cost?

Well, me.  Everything cost the same as when we started, but now they could deliver the site in a quarter of the time.  Sure, they paid me for my time, but I'm pretty cheap.  Imagine the boost that gave their conversions, site traffic, repeat visitors and ultimately sales.  They probably made up for my time in one sale.

So technical SEO should be seen as a revenue generator, and the associated costs can likely be covered for the price of a tank of gas.  Use a Linux OS, open source software and virtual infrastructure, and you've spent next to nothing.

It's just like investing in a coffee maker if you're Starbucks: without it there isn't any business.


Hits: 3

Categories: SEO, Technical SEO
Tag: budget

There's likely debate, and I don't claim to have tested everything out there, but I've found this mix works well for me.

WPA Auto Links – automatically adds internal links in your content when it recognizes a phrase you've already covered elsewhere.

Yoast – hands down the best, most configurable SEO plugin for controlling what people see in search results. I rarely use the keyword tool where it says to add this word here or there, and I prefer XML sitemaps from a different plugin.

XML Sitemaps and Google News – this plugin just makes the sitemap files I want exactly the way I want them.

Schema – structured data markup is something Yoast also does, but I prefer a plugin dedicated to doing it really well.

CyberSyn – for turning RSS feeds into posts; it offers the most features of any RSS-to-post plugin I know of on WordPress.

Ultimate FAQs – I am a fan of FAQs all the time and anytime.  I like that in this plugin they all sit on one page together, but a permalink also exists for each FAQ on its own, giving you the flexibility of serving it a la carte.

Smush – image optimization is essential; without a plugin ensuring you're not showing a 2000 × 2000 image in a 200 × 200 spot, you'll quickly lose control of your site's load times.

Merge Minify Refresh – you'll have 25 CSS and 20 JS files on every page, and this plugin helps tackle that without breaking everything.  I am NOT a fan of Autoptimize; I usually find myself recovering from backup after it auto-does its thing.

404 to 301 – I like to redirect 404s to a search page that's likely to help someone find what they're after, rather than just dropping them on a 404 error page or the homepage.

WP-Optimize – I prefer this tool … always back up first, but it does a good amount of cleaning and helps keep your database tables neat.  Neat means faster page load times.

Hits: 18

Categories: SEO, Technical SEO



