
These FAQs are always expanding and gaining topics. We encourage you to ask an SEO question if it isn't already included: just visit Submit an SEO FAQ and we'll respond to you, and if the question seems relevant to many readers we'll add it here.

Mobile SEO (3)

Let me show you how important it is….

desktop vs mobile search results

Why is realtor.com not higher than zumper.com in the mobile search on the left? Consider these metrics:

Realtor.com = Domain Score: 55, Trust Score: 58, Alexa Rank: 763, Registered: 1996, Ads: 3,900, Backlinks: 57,000,000, Traffic Rank: 108

Zumper.com = Domain Score: 37, Trust Score: 44, Alexa Rank: 17,000, Registered: 2004, Ads: 0, Backlinks: 1,700,000, Traffic Rank: 2,830

Realtor.com wins on every metric, so why does it sit below zumper.com in the mobile results?

Site Speed Test on GTMetrix

Realtor.com Fails Speed

site speed

Zumper.com Passes Speed

page load

So in this example we clearly see a more popular site beaten by a less established one, and the only factor the smaller site did better on was speed. And we can't dismiss this as something that only matters on mobile. In case you missed it…

60% of searches are mobile


Content Delivery Networks are used to speed up sites by removing the bottleneck created by having one server provide files. These added servers are then spread out over a geographical area bringing the files closer to end users. This network could be global or regional.

Cloudflare is an example of a CDN, as is Jetpack. A popular paid CDN is AWS's CloudFront. These servers are called edge servers because they sit at the edge of the provider's network, spread out across many locations.


Volume and because Google says it is.  That’s honestly the best answer…now for the why….

  • More than half of the searches on Google are from mobile devices. If you only think in terms of searching from a desktop, you ignore the majority of searches. A mobile device usually displays content in a small, limited-width view, so desktop pages are not appropriate for a good mobile experience. Mobile devices also generally load pages more slowly because they lack a physical connection to the network. Pages need to load fast and be streamlined for the best experience, so page load speed is extremely important.
  • Mobile First – an initiative Google started in 2018. The name is self-explanatory. If you don't want to put mobile first, you'll ignore the greatest customer you have…Google.
  • Mobile Index – to address the disparity between the mobile experience and the desktop experience, Google created a second index of results that's independent of the desktop results. You could be the number one result on desktop but not listed at all on mobile. Since more than half of searches are on mobile devices, you would lose over half the opportunities to reach a visitor.

Ignore mobile devices at your own peril; in my humble opinion your SEO efforts are doomed if you do. I don't work on projects I don't think can succeed, so I'd bow out of any project that ignores mobile.

Here is an example: searching "rental homes in louisville," you'll notice that the desktop and mobile results come in a different order.

desktop vs mobile search results

It appears that while realtor.com is considered better for desktop, zumper.com has a better mobile experience. It's likely better laid out and faster than realtor.com, so it beats that site on mobile. Who wins here? Zumper, since there are more searches on mobile than on desktop. You might think, what's the big difference, it's only one position? The difference between 3rd and 4th or 2nd and 3rd is huge, and the difference between 1st and 2nd position is more than 50%. If this is a billion-dollar industry, we're talking about potentially hundreds of millions of dollars.

And it IS likely layout and speed. Notice that in the desktop view realtor.com has a higher trust score and a higher domain score. Realtor.com has a higher rank, advertises more, and was registered as a domain first; by almost every SEO metric realtor.com should outrank zumper.com. BUT on mobile it does not.


Political SEO (3)

I have to admit they are useful in my opinion. Say you have 10 blogs that you do SEO for: you can use each blog's RSS feed with an IFTTT Applet to auto-generate your social media announcements of new content, rather than manually posting to Facebook, Twitter, LinkedIn, Blogger, Tumblr, Instagram and so on.

You can also ensure that your blog gets regular updates, which improves the likelihood of crawlers visiting your site more frequently. It's not the main-attraction content you would be pulling in but filler content that is still relevant. Think of it like this: when you go to your local newspaper's site, most of the national news is likely from the Associated Press. That means the article is duplicate content, but duplicate content isn't as bad as folks make it out to be; it happens, and in journalism it's more the norm than the exception.

Sometimes I feel that RSS content, aggregated well, can create unique content. For example, during an election I built a site that aimed to have all the local election news in one spot. VoteLouisville.com pulled the RSS feeds of anyone running for office, which created a place where you could read two candidates' updates side by side and compare what they were focusing on in the election. The candidates want their content reposted, and the voters likely want to see both sides, so in that scenario I think the RSS content made uniquely different aggregated content.
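As a rough illustration of the RSS-to-post idea, here is a minimal sketch that polls a feed and lists entries it hasn't seen before; the feed URL is a placeholder, and handing entries off to a social network or post queue is left as an assumption rather than a description of how IFTTT or CyberSyn actually work.

```python
# Minimal sketch: poll an RSS feed for new entries
# (assumes the third-party package: pip install feedparser).
import feedparser

FEED_URL = "https://example.com/feed/"  # hypothetical feed URL
seen_links = set()  # in a real setup this would be persisted between runs

def new_entries(feed_url):
    """Return feed entries whose links have not been seen yet."""
    feed = feedparser.parse(feed_url)
    fresh = [e for e in feed.entries if e.link not in seen_links]
    seen_links.update(e.link for e in fresh)
    return fresh

for entry in new_entries(FEED_URL):
    # Here you might hand the title and link to a social posting service.
    print(entry.title, "-", entry.link)
```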


Negative SEO is a branch of SEO that largely came into existence with Google's Penguin update. For the first time, a search engine was prepared to lower your ranking based on what it perceived as a site gaming the system. Not only did sites change the way they promoted themselves, but this also created a very real opportunity to harm competitors' sites.

All you have to do is mimic spammy promotion of a target site to draw Google into penalizing it. This can include:

  • Building too many unnatural backlinks from low-quality sites too quickly for the growth to have been organic
  • Using anchor text with adult themes or words associated with low-brow tactics
  • Associating the competitor or target site with link-building schemes, such as a PBN
  • Duplicating their content, thereby watering down the value of the original

Google On Negative SEO


Political SEO is search engine optimization meant to affect a political outcome by ranking a political message for a specific intended audience before a future event, often an election. Political SEO is commonly mixed with PPC (pay-per-click) advertising, since SEO work generally requires time to build momentum while political campaigns are often short-lived entities.

Political SEO may promote a candidate, party, idea, platform or agenda and intends to influence an audience’s perception of that focus.


SEO (15)

Domain Authority (DA) is a Moz.com metric meant to predict a domain's ability to rank content on a search engine. It relies heavily on backlinks and anchor text, just like Google. DA is a number between 0 and 100, and it gets increasingly harder to raise. A site with 20 backlinks from 20 different domains might earn a DA of 10, but when that site has 40 domains linking to it, it won't have a DA of 20; it might have a 15.

The law of diminishing returns is easy to see in this topic. Over a 4-month period a DA 33 site increased its linking domains from 400 to 800 and its DA rose to 40: 7 points for double the domains. As of 2/1/2019 this site has a DA of 41; a month ago it had a DA of 40. In the last 30 days it added about 34 linking domains for a total of 854. The other site previously mentioned also had about 800 linking domains when it became a DA 40.

An in-depth discussion is included in an article on this site: Domain Authority, Why Should I Care?


are we there yet?

This is likely the most annoying question I get, but I get it: we all want to know the future. Anyone who gives you a set timetable is lying or doesn't realize they don't know. We can guesstimate, and I mean guess-estimate. For us to make an accurate quote on how long it will take, we'd have to know what other people may do on the same keyword in the future, and often they don't even know they are going to rank for that keyword a week from now.

You have to understand that with 200 factors coming together in your ranking, there are thousands of other pages, each with their own 200 ranking factors, that will play into the answer. Sometimes unrelated things, like adding a huge blog post on something else with poorly optimized images, can slow down the whole site, and that speed decrease may be enough to move someone else a spot higher.

In broad terms I'd say that with backlink building from relevant sites using the correct anchor text, users clicking your search result more often than the one above you, and some good, optimized content, you can rank at the top in less than a month for easier long-tail keywords. If it's a difficult keyword with competition surrounding it, I'd expect it to take longer and I'd be prepared for fluctuation; some days you may lose some ground, so don't get upset. The thing I'd focus on is the trend: is it climbing, and can you keep up that pace? Here is a set of keywords I use as a baseline for this site; I think the trend is the most important thing.

keyword ranking trend


They are just longer phrases with a more exact intent from a searcher. For instance, "SEO" isn't a long-tail keyword. "Ultimate SEO" isn't either, but it's twice as close. "Ultimate SEO FAQs" is just about there, and you can see it getting more and more specific; "Ultimate SEO FAQs On Long Tail Keywords" is a long-tail keyword.

They are searched for less often, but they are much more specific and it's easier to rank well for them with the right content.


There's likely debate, and I don't claim to have tested everything out there, but I've found this mix works well for me.

WPA Auto Links – Automatically adds internal links in your content when it recognizes a phrase you have already covered elsewhere on the site.

Yoast – Hands down the best, most configurable SEO plugin for controlling what you want people to see in search results. I rarely use the keyword tool where it says to add this word here or there, and I prefer XML sitemaps from a different plugin.

XML Sitemaps And Google News – Makes the sitemap files I want exactly the way I want them.

Schema – Yoast also does schema markup, but I prefer a plugin that does it really well.

CyberSyn – For RSS feeds to posts; it offers the most features of any RSS-to-post plugin I know of on WordPress.

Ultimate FAQs – I am a fan of FAQs all the time and anywhere. I like that in this plugin they all appear on one page together, but a permalink also exists for each FAQ on its own, giving you the flexibility of serving it a la carte.

Smush – Image optimization is essential; without a plugin to ensure you're not showing a 2000 x 2000 image in a 200 x 200 spot, you'll quickly lose control of your site's load times.

Merge Minify Refresh – You'll have 25 CSS and 20 JS files on every page, and this plugin helps tackle that without breaking everything. I am NOT a fan of Auto-Optimize; I usually find myself having to recover from backup after it auto does its thing.

404 to 301 – I like to redirect 404s to a search page that is likely to help someone find what they are after, rather than just dropping them on a 404 error page or the homepage.

WP-Optimize – I prefer this tool. Always back up first, but it does a good amount of cleaning and helps keep your database tables neat. Neat means faster page load times.


Great question…here is a video from them.  It is worth watching because it provides good advice for hiring an SEO and sets expectations.


SEO Traffic

Search Engine Optimization, or SEO, is a broad term that covers actions intended to improve a website or web page's visibility within one or more search engines. It's ever-evolving as the search engines change and the way they rank is updated.

In the earliest days of the internet, keywords and keyword stuffing reigned supreme. As search engines have evolved, they no longer use old signals such as the meta keywords tag or keyword density. Keywords are still important, but it's their contextual use and placement that now matters.

SEO includes "off-page" topics such as anchor text, backlinks and technical SEO. In all, some 200 factors are considered by Google when ranking sites for searches, and each of these factors is arguably a subsection of SEO.

SEO can further be divided into whitehat and blackhat, with many SEOs falling into a hybrid greyhat area. Blackhat SEO is considered fast and risky, while whitehat SEO works within a search engine's requirements and seeks long-term growth that may take some investment of time and money.


Auto blogging is the practice of spreading content across sites, usually the same content but sometimes with subtle changes in wording. It's usually done to pad content, and it's similar to syndicating news or guest posts that may appear on other sites.

What's the issue with it? Well, it can't be the most unique, fresh content if you're auto blogging it. Even with the use of synonyms and word spinners, the content is below grade and could count as duplicate content. You should ensure the content isn't plagiarized and give credit to the author. Canonical tags come into play here: you get the use of the content, but you provide the SEO value and credit back to the author.

What's wrong with spinner content? It's not natural. Consider the sentence "It's the day after Valentine's Day here and I'm just hanging out with my dog." I got back "Its the after a long time after Valentine's Day here and I'm simply spending time with my puppy." from Free Article Spinner.com. There's now a mishap at the beginning of the sentence, and "I'm simply spending time with my puppy": how would it not be simply? Or how about another test sentence…

I prefer doing technical SEO audits to local SEO work just because Dallas, Houston and Los Angeles are all very different from Louisville or Chicago.

I incline toward doing specialized SEO reviews to nearby SEO work since Dallas, Houston and Los Angeles are for the most part altogether different from Louisville or Chicago.

Get the point on auto blogging? I'd say you're better off rewriting it yourself, and then it's not auto blogging.


James Cameron does what James Cameron does because James Cameron is James Cameron.

That is keyword stuffing. It's easy to spot and pretty obvious: a shameless self-promotional tactic that needlessly forces a keyword into a page's content. At best it distracts visitors, and at worst it hurts your SEO value.


In the early 2000s. PageRank isn't a publicly available metric anymore, according to Google. You can monitor your page and domain using other metrics such as Domain Authority, Citation Flow and Trust Flow. These are third-party estimates of how they expect Google to view your site when ranking keywords; they are not Google's metrics. There are additional competing metrics from other sites as well, and you should be aware of them all. Test them out and decide for yourself which numbers work best for you.

Ultimate SEO uses Domain Authority and Trust Flow, along with Spam Score, to help build a baseline depiction of a page's ability to rank for a keyword.


Keywords first and foremost should be located where they make sense.  Don’t force them into a place just because you want that keyword.

With that said, keywords should appear in the page title, meta description, img alt tags, headers, and a few times in the content. Since each page needs to be unique, I recommend focusing on one main keyword per page or post. That way you can focus on that keyword and all of the content on the page is relevant to it.
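As a rough way to audit those placements, here is a minimal sketch that downloads a page and reports where a keyword shows up; the URL and keyword are placeholders, and it assumes the `requests` and `beautifulsoup4` packages rather than any particular SEO tool.

```python
# Minimal sketch: report where a keyword appears on a page
# (assumes: pip install requests beautifulsoup4).
import requests
from bs4 import BeautifulSoup

def keyword_placements(url, keyword):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    kw = keyword.lower()
    desc = soup.find("meta", attrs={"name": "description"})
    return {
        "title": kw in (soup.title.string or "").lower() if soup.title else False,
        "description": kw in (desc.get("content", "").lower() if desc else ""),
        "headers": any(kw in h.get_text().lower() for h in soup.find_all(["h1", "h2", "h3"])),
        "img_alt": any(kw in (img.get("alt") or "").lower() for img in soup.find_all("img")),
        "body_count": soup.get_text().lower().count(kw),
    }

print(keyword_placements("https://example.com/", "seo"))  # hypothetical page and keyword
```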


SEO Audits (8)

The best plan is to hire me to do a technical audit and let me implement my recommendations. Too easy, right? Okay, so first you need to establish a baseline: go to gtmetrix.com and put in a targeted page you'd like to optimize.

gtmetrix technical seo tool

Okay, so I'm next to perfect here, but it wasn't always that way. I used to have a slider with 5 large images at the top of my homepage. That slider required a dozen or so JavaScript and CSS files to load, and since it was the first thing you saw, you had to wait for it before the page began to function. After that the page had more images, more JavaScript and more CSS files. Each file on a page is a request: your browser has to ask the server for it, the server may take a moment to respond with the file, and then the browser has to put it in place on the page. Multiply that process by a hundred and you see a chance to improve your site speed.

Additionally, combining files cuts down on the time required. I have 8 or so certification icons on the homepage, but I merged them into 2 images instead of 8. Here is an example…

technical certifications

We can also improve load times by getting a server that is dedicated to the site. You might try a micro instance on AWS or Google Cloud Platform; they cost hardly anything, but the resources are available for your site when you need them. Cloudflare also offers speed advantages through its cache and CDN network. There's more, but these items should be enough to start you on the road to, at most, a 3-second load time. That's honestly where you have to draw the line if you want to rank a keyword against competitors who are willing to do what it takes to load within 3 seconds.
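Since every linked file is an extra request, a quick way to gauge how much a page asks the browser to fetch is to count its script, stylesheet and image tags. A minimal sketch under the assumption of a placeholder URL and the `requests`/`beautifulsoup4` packages (GTmetrix will of course give a far more complete picture):

```python
# Minimal sketch: count the extra requests a page's HTML will trigger
# (assumes: pip install requests beautifulsoup4).
import requests
from bs4 import BeautifulSoup

def count_page_requests(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    scripts = len(soup.find_all("script", src=True))
    styles = len(soup.find_all("link", rel="stylesheet"))
    images = len(soup.find_all("img", src=True))
    return {"js": scripts, "css": styles, "img": images,
            "total": scripts + styles + images}

print(count_page_requests("https://example.com/"))  # hypothetical page
```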


This is a solid, reasonable question that unfortunately requires a sliding-scale answer. Since your Google ranking is made up of 200 factors and domain popularity is just one of them, no exact number can be used to determine your rank. First, what's the quality of the linking domains? 100 no-name domains that you just created aren't worth one link from huffingtonpost.com.

I would suggest there is a rough rule of thumb…

  • DA 10 = fewer than 50 linking domains
  • DA 20 = 200
  • DA 30 = 400
  • DA 40 = 800
  • DA 50 = 2,000

These are not exact numbers, just rules of thumb I've found helpful. Basically, for every ten more points of DA, the number of linking domains may need to double. PBN sites and domains with little to no traffic are maybe only worth a fourth of a real domain, so if you have 800 PBN sites linking to you, I'd say you may expect a DA of about 10 to 20. If those 800 are solid, real domains, your DA should be 35 to 40.
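Purely to illustrate the rule of thumb above (roughly double the linking domains for every ten points of DA, with PBN-style domains counted at about a quarter weight), here is a small sketch. The thresholds and weights come from the table and paragraph above; this is not Moz's formula.

```python
# Minimal sketch of the rough rule of thumb above; not Moz's actual algorithm.
def estimate_da(real_domains, pbn_domains=0):
    """Guess a ballpark DA from linking-domain counts using the table above."""
    effective = real_domains + pbn_domains / 4  # PBN/no-traffic domains ~1/4 weight
    thresholds = [(2000, 50), (800, 40), (400, 30), (200, 20), (50, 10)]
    for needed, da in thresholds:
        if effective >= needed:
            return da
    return 0  # below roughly 50 effective linking domains

print(estimate_da(800))       # 800 solid domains -> around DA 40
print(estimate_da(0, 800))    # 800 PBN-only domains -> around DA 20
```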

 

 


Quick Answer: Yes

Long Answer: Yes. They don't do the same thing; Google wouldn't have built the same tool twice.

Google Analytics tracks your visitors and their experience and behavior on your site regardless of where they came from or how they got there. That means Bing traffic is counted, and traffic attained directly, where someone types the address into the address bar, is counted as well.

Google Search Console tracks searchers' queries and how you ranked for them, and it includes data on your query performance even when your result wasn't the one selected.

  • Google Analytics tracks visitors to the site no matter where they came from.
  • Google Search Console – tracks the queries you were displayed for and how often Google Searchers selected you as the result they wanted for that query.

So there's no Bing traffic in Google Search Console, and no keyword or position data in Google Analytics for searchers who never clicked through to your site.

Two products, two different audiences, two different uses …. different data.

For fun and for transparency, I often use Google Data Studio to pull those two products together for y'all.

You can tie the two together if you’d like data shared…


Access Search Console data in Google Analytics

This feature is currently supported only in the old Search Console.

If you associate a Google Analytics property with a site in your Search Console account, you’ll be able to see Search Console data in your Google Analytics reports. You’ll also be able to access Google Analytics reports directly from the Links to your site, and Sitelinks pages in Search Console. Note that you can only associate a website; you cannot associate an app.

Associate properties

You must be an owner of the Google Analytics property to be able to associate it with a website in Search Console.

You can open the Google Analytics association page from the  property settings dropdown in Search Console.

When you associate a site in your Search Console account with a Google Analytics property, by default Search Console data is enabled for all profiles associated with that property. As a result, anybody with access to that Google Analytics property may be able to see Search Console data for that site. For example, if a Google Analytics admin adds a user to a profile, that user may be able to see Search Console data in Search Optimization reports.

A site can be associated with only one property, and vice versa. Creating a new association removes the previously existing association.

Every Google Analytics property can have a number of views. When you associate a site with a property, clicking a link to Google Analytics from Search Console will take you to that property’s default view. (If you previously associated your site with a different view, clicking a link will now take you to the default view instead. If you want to see a different view, you can switch views from within Google Analytics.)

If your site is already associated with a Google Analytics property, it could be for a couple of reasons. Either you already used Google Analytics to associate this property with the site, or another site owner has made the association.

If your site is associated with an Analytics property you don’t recognize (and don’t want), it may be because another site owner associated the site with an Analytics property you don’t own. In this case, you can delete the association and create a new one.

If your site used to be associated with a property, but no longer is, it may be that the property was later associated with a different site. (Remember, a site can be associated with only one property. Creating a new association will remove the previously existing association.)

You can also create an association using the Analytics admin page if you're an account administrator for the Google Analytics property. Google Analytics account administrators can move their Analytics property from one Analytics account to another. Any associated Search Console properties will maintain their association as part of the move. After the move, any users of the new Analytics account will be able to see data from the associated Search Console property without a notification in Search Console.

Removing Search Console data from Google Analytics

To remove Search Console data from a Google Analytics property, unlink the association using Search Console’s association page, or manage association in the  Analytics admin page (if you’re an administrator for the Google Analytics property).

Why doesn’t Search Console data match Google Analytics data?

Search Console data may differ from the data displayed in other tools, such as Google Analytics. Possible reasons for this include:

  • Search Console does some additional data processing—for example, to handle duplicate content and visits from robots—that may cause your stats to differ from stats listed in other sources.
  • Some tools, such as Google Analytics, track traffic only from users who have enabled JavaScript in their browser.
  • Google Analytics tracks visits only to pages which include the correctly configured Analytics Javascript code. If pages on the site don’t have the code, Analytics will not track visits to those pages. Visits to pages without the Analytics tracking code will, however, be tracked in Search Console if users reach them via search results or if Google crawls or otherwise discovers them.
  • Some tools define “keywords” differently. For example:
    • The Keywords page in Search Console displays the most significant words Google found on your site.
    • The Keywords tool in Google Adwords displays the total number of user queries for that keyword across the web.
    • Analytics uses the term “keywords” to describe both search engine queries and Google Ads paid keywords.
    • The Search Console Search Analytics page shows the total number of keyword search queries in which your page's listing was seen in search results, which is a smaller number. Also, Search Console rounds search query data to one or two significant digits.


First off, honestly, take pretty much everything you know about duplicate content and forget it. You will have duplicate content and that's alright; it's normal. The Associated Press is a highly syndicated news source. Do you think Google penalizes every news site that includes AP content?

Think I'm talking crazy? Look for information from Google on duplicate content and ignore info from anyone else. To save you a step, here's me duplicating content, copied from Google: https://support.google.com/webmasters/answer/66359?hl=en

 

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin. Examples of non-malicious duplicate content could include:

  • Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
  • Store items shown or linked via multiple distinct URLs
  • Printer-only versions of web pages

If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google. (This is called “canonicalization”.) More information about canonicalization.

However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.

Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a “regular” and “printer” version of each article, and neither of these is blocked with a noindex meta tag, we’ll choose one of them to list. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

There are some steps you can take to proactively address duplicate content issues, and ensure that visitors see the content you want them to.

    • Use 301s: If you’ve restructured your site, use 301 redirects (“RedirectPermanent”) in your .htaccess file to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)

 

    • Be consistent: Try to keep your internal linking consistent. For example, don’t link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.

 

    • Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We’re more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.

 

  • Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
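Canonicalization, mentioned in the Google excerpt above, is usually implemented with a rel="canonical" link tag. Here is a minimal sketch that checks which canonical URL a page declares; the URL is a placeholder and the `requests`/`beautifulsoup4` packages are assumptions, not part of any Google tooling.

```python
# Minimal sketch: report the rel="canonical" URL a page declares
# (assumes: pip install requests beautifulsoup4).
import requests
from bs4 import BeautifulSoup

def canonical_url(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag.get("href") if tag else None

# Duplicate URLs such as /page, /page/ and /page/index.htm should all
# declare the same preferred version in their canonical tag.
print(canonical_url("https://www.example.com/page/"))  # hypothetical page
```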


SEO Backlinks (17)

301 Redirect

301 redirects are used when content that was located at one web address, or URL, is moved to another web address. They're useful for ensuring traffic is still served when someone uses the old web address. Consider:

A product page for rat poison, including what to do if it's accidentally ingested, is available on the company's website. Poison Control googles that poison, finds the page with the instructions, and bookmarks it for future reference. The company is bought out by a larger company that will rebrand the poison, and when it is rebranded, the original site will be turned off. Without a 301 redirect, the Poison Control bookmark would just go to an error. With a 301 redirect, the old link is redirected to the new site's content.

302 Redirect

A 302 redirect is the same thing as a 301 redirect but temporary. If a search engine crawls a page and sees a 301 redirect, it knows to change the listing in its results to the new target. A 302 redirect tells it not to make that change, since the move is only temporary. 302s are used less often than 301 redirects.
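To see which kind of redirect a URL actually returns, you can request it without following redirects and look at the status code (301 vs. 302) and the Location header. A minimal sketch with a placeholder URL, assuming the `requests` package:

```python
# Minimal sketch: show whether a URL answers with a 301 or 302 redirect
# (assumes: pip install requests).
import requests

def check_redirect(url):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302):
        return resp.status_code, resp.headers.get("Location")
    return resp.status_code, None

status, target = check_redirect("http://example.com/old-page")  # hypothetical URL
print(status, target)  # e.g. 301 plus the new address, or 200 and None
```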

 


Answer: Does a bear shit in the woods?

PBNs are extremely effective, completely relevant, and still widely used in 2019.

do pbns work still?

 PBNs in 2019

They may look different and work slightly differently from when an SEO used a 10-site network in 2005, but they work. If they didn't work, Google wouldn't be trying to destroy them.


Backlinks are links from other sites. Think of them as votes of affirmation. Only one vote can come from each domain, so for SEO purposes it doesn't matter whether there are 100 links or 1 link from the same domain; one link is the count you gain. Those extra links may increase traffic to your site, but in regards to SEO it's one vote.

The more domains that link to you, the more authoritative you must be, right? Well, kinda. If 1,000 domains link to your site, you are likely more authoritative than a site that only 3 sites link to. But not all domains, votes, or backlinks are the same. A link to your site from UltimateSEO.org carries with it the weight attributed to that site by its own backlinks. Different companies refer to the authority of a site differently.

You can see a site's backlinks in many indexes, most of them paid. Ultimate SEO recommends Monitor Backlinks if you want a tool that is really good at backlinks. Ultimate SEO received nothing for that endorsement. The endorsement, or vote, as you see, is a backlink.
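Because only one "vote" per domain counts, backlink lists are usually collapsed to unique referring domains before comparing sites. A minimal sketch of that idea, using made-up example URLs:

```python
# Minimal sketch: collapse a list of backlink URLs to unique referring domains,
# since only one "vote" per domain counts for SEO purposes.
from urllib.parse import urlparse

backlinks = [  # hypothetical backlink URLs
    "https://ultimateseo.org/some-article/",
    "https://ultimateseo.org/another-article/",
    "https://www.example-news.com/story",
]

def referring_domain(link):
    netloc = urlparse(link).netloc
    return netloc[4:] if netloc.startswith("www.") else netloc

referring_domains = {referring_domain(link) for link in backlinks}
print(len(referring_domains), referring_domains)  # 2 votes, not 3 links
```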


Google provides a tool called the Disavow Tool. It allows site owners to list backlinks that they don't want counted as part of their site's backlink profile. Some backlinks may harm your SEO efforts and are considered "spammy." But Google has said it is now pretty good at recognizing a poor-quality backlink and ignoring it, so you may not need to list these links at all.

You should only use the Disavow Tool if you know your site is being penalized, which you can find out from Google Search Console. Disavowing before any issue exists may hurt your SEO if you disavow links Google actually thought were good.
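For reference, the file the Disavow Tool accepts is plain text with one URL or `domain:` entry per line and `#` for comments. Here is a minimal sketch that writes one from a list of domains and URLs you have decided to disavow; the domains and URLs below are made up.

```python
# Minimal sketch: write a disavow.txt in the plain-text format Google's
# Disavow Tool accepts (one URL or "domain:" entry per line, "#" comments).
spammy_domains = ["spammy-links.example", "cheap-backlinks.example"]  # hypothetical
spammy_urls = ["http://somesite.example/bad-page.html"]               # hypothetical

with open("disavow.txt", "w") as f:
    f.write("# Links disavowed after a manual review\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")
    for url in spammy_urls:
        f.write(url + "\n")
```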


Backlink indexers are simply sites, such as indexkings.com, that claim to get your backlinks found by Google faster. The jury is out on how useful they actually are, but essentially they place backlinks to your backlinks. The theory is that if you have one page with a link you want Google to find, you can do a lot better with 800 links pointing to that backlink page. It sounds reasonable, but is 800 versus 1 really any different when we are talking about zillions of links on the web?

I've used both free and paid indexers, and ultimately I saw no real, marked difference in how quickly Google indexed the page my backlink sat on. With that said, please share supporting evidence if you disagree. By the way, indexkings.com, linked above, is a free site, so maybe you can run a study for us that only costs you time.

Final thought: if you receive a good backlink, it's likely on a popular site and Google will find it without issue. If you paid for a link on some obscure site, then sure, indexing may be worth it.


SEO Content (3)


Yes, already, come on! An interesting thing happened once on a blog post I wrote: while the article was stuck on page two, the image I used in the post, which had a good img alt tag, ended up ranking #1 for that keyword in image search. Image search is just less competitive, and the point is that a few images are usually shown on page one, so with the right image you can pull in more clicks than your page-two article.

It's also a requirement. Alt tags are needed for screen readers to describe images to a blind person; they're part of your site's accessibility for people with ADA needs. In the end a web crawler is like a blind person, so use the tag to add yet another supporting element on the page for the keyword you're after. Often others don't use those tags, so it can be your secret weapon.

Citation Flow – I used this image to convey Citation Flow's uselessness, and the img alt tag supports that.
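A quick way to audit this is to list every image on a page that is missing alt text. A minimal sketch with a placeholder URL, assuming the `requests` and `beautifulsoup4` packages:

```python
# Minimal sketch: list images on a page that are missing alt text
# (assumes: pip install requests beautifulsoup4).
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return [img.get("src") for img in soup.find_all("img")
            if not (img.get("alt") or "").strip()]

for src in images_missing_alt("https://example.com/"):  # hypothetical page
    print("Missing alt text:", src)
```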


Technical SEO (7)


page speed 3 second rule

All of your site's pages should load within three seconds. After five seconds, the number of people still waiting for your page to load is about half of those who started. As your page takes longer to load, Google lowers your rank so it can serve visitors results they'll actually use.

A lot of people dismiss this element of technical SEO, and a lot of people aren't going to rank for the keywords they want. Optimize your page content and server hardware to ensure pages are visible within three seconds.

Test the top results for the keyword you're hoping to rank for, and I can pretty much guarantee they load in under 3 seconds. The top result for Barack Obama was his Wikipedia page, which loads in 2.8 seconds according to GTmetrix. The top result for Ultimate SEO is SEO Ultimate Plus, which loads in 3.9 seconds. Our site loads in 3.4 seconds.

Page speed is one of the 200 factors that go into Google's ranking, but it's one of the easiest to change.
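GTmetrix gives the fullest picture, but you can get a crude baseline for the three-second rule by timing how long the raw HTML takes to download. A minimal sketch with a placeholder URL, assuming the `requests` package; note this ignores images, CSS and JavaScript, so the full render time GTmetrix measures will be longer.

```python
# Minimal sketch: time how long a page's HTML takes to download
# (assumes: pip install requests). Ignores images/CSS/JS, so a full render
# as measured by GTmetrix will take longer than this number.
import time
import requests

def html_load_time(url):
    start = time.perf_counter()
    requests.get(url, timeout=30)
    return time.perf_counter() - start

seconds = html_load_time("https://example.com/")  # hypothetical page
print(f"{seconds:.2f}s", "OK" if seconds < 3 else "over the 3-second rule")
```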

 
