These FAQs are always expanding and gaining topics. If your SEO question isn't already covered, we encourage you to ask it: just visit Submit an SEO FAQ and we'll respond to you, and if it seems relevant to many readers we'll add it here.

SEO Audits (8)

The best plan is to hire me to do a Technical Audit and let me implement my recommendations. Too easy, right? Okay, so first you need to establish a baseline: go to gtmetrix.com and enter a targeted page you'd like to optimize.

[Image: GTmetrix report for a targeted page]

Okay, so I'm next to perfect here, but it wasn't always that way. I used to have a slider with 5 large images at the top of my homepage. That slider required a dozen or so JavaScript and CSS files to load, and since it was the first thing you saw, you had to wait for it before the page began to function. After that the page had images and more JavaScript and more CSS files. Each file on a page is a request: your computer has to ask the server for it, the server may take a moment to respond with the file, and then the browser has to put it in place on the page. Multiply that process by a hundred and you see a chance to improve your site speed.
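Just to make that request count concrete, here is a minimal Python sketch (not part of any audit tool) that tallies the extra files a page asks the browser to fetch by counting script, stylesheet, and image tags in the HTML. The URL and the use of the requests and BeautifulSoup libraries are my own illustrative choices.

```python
# Minimal sketch: tally the extra requests a page triggers by counting
# script, stylesheet, and image references in its HTML.
# Requires the third-party packages `requests` and `beautifulsoup4`;
# the URL below is only a placeholder.
import requests
from bs4 import BeautifulSoup

def count_page_requests(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    counts = {
        "scripts": len(soup.find_all("script", src=True)),
        "stylesheets": len(soup.find_all("link", rel="stylesheet")),
        "images": len(soup.find_all("img", src=True)),
    }
    counts["total_extra_requests"] = sum(counts.values())
    return counts

if __name__ == "__main__":
    print(count_page_requests("https://www.example.com/"))
```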

Combining files also cuts down on the time required. I have eight or so certification icons on the homepage, but I merged them into two images instead of eight. Here is an example…
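If you want to script that kind of merge rather than do it in an image editor, here is a rough sketch using the Pillow library; the file names are placeholders, and on the page you would then display each icon by offsetting the combined image with CSS.

```python
# Sketch: stitch several small icons into one horizontal sprite image.
# Uses the Pillow library (`pip install Pillow`); file names are placeholders.
from PIL import Image

def build_sprite(icon_paths, output_path="certifications-sprite.png"):
    icons = [Image.open(p) for p in icon_paths]
    width = sum(icon.width for icon in icons)
    height = max(icon.height for icon in icons)
    sprite = Image.new("RGBA", (width, height), (0, 0, 0, 0))
    x = 0
    for icon in icons:
        sprite.paste(icon, (x, 0))  # place each icon side by side
        x += icon.width
    sprite.save(output_path)

build_sprite(["cert-1.png", "cert-2.png", "cert-3.png", "cert-4.png"])
```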

[Image: technical certification icons combined into two images]

We can also improve load times by getting a server that is dedicated to this site. You might try a micro server on AWS or Google Cloud Platform; they cost hardly anything, but the resources are available for your site when you need them. Cloudflare also offers speed advantages through its cache and CDN network. There's more, but these items should be enough to start you on the road to a load time of at most 3 seconds. That's honestly where you have to draw the line if you want to rank a keyword against competitors who are willing to do what it takes to load within 3 seconds.
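Between proper GTmetrix runs, a very rough way to keep an eye on that 3-second line is to time how long the raw HTML takes to come back. This only measures the first response, not the fully rendered page, so treat it as a lower bound; the URL is a placeholder.

```python
# Rough check against the 3-second budget: time only the HTML response.
# A real tool such as GTmetrix measures the fully rendered page, so this
# number will understate the true load time.
import time
import requests

def time_page(url: str, budget_seconds: float = 3.0) -> None:
    start = time.perf_counter()
    response = requests.get(url, timeout=30)
    elapsed = time.perf_counter() - start
    verdict = "within" if elapsed <= budget_seconds else "over"
    print(f"{url}: {elapsed:.2f}s, HTTP {response.status_code} "
          f"({verdict} the {budget_seconds:.0f}-second budget)")

time_page("https://www.example.com/")
```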

Are we there yet?

This is likely the most annoying question I get, but I get it: we all want to know the future. Anyone who gives you a set timetable is lying or doesn't realize they don't know. We can guesstimate it, and I mean guess-estimate, because to quote an accurate timeline we would have to know what other people will be doing on the same keyword in the future, and often they don't even know they'll be trying to rank for that keyword a week from now.

You have to understand that with roughly 200 factors coming together in your ranking, there are thousands of competing pages, each with their own 200 ranking factors, that will play into the answer. Sometimes unrelated things, like adding a huge blog post on another topic with poorly optimized images, can slow down the whole site, and that speed decrease may be enough to move someone else a spot higher.

In broad terms I'd say that with backlink building from relevant sites using the correct anchor text, users clicking your search result more than the one above you, and some good, optimized content, you can rank at the top in less than a month for easier long-tail keywords. If it's a difficult keyword with competition surrounding it, I'd expect it to take longer, and I'd be prepared for fluctuation: some days you may lose some ground, so don't get upset. The thing I'd focus on is the trend. Is it climbing, and can you keep up that pace? Here is a set of keywords I use to serve as a baseline for this site; I think the trend is the most important thing.

[Image: keyword ranking trend]
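If you want to put a number on "is it climbing?", one simple option is to fit a straight line through the daily positions and look at the slope; a negative slope means the keyword is moving up, since position 1 is the top. The positions below are made up purely for illustration.

```python
# Sketch: express a ranking trend as the slope of a fitted line.
# Daily positions are sample data; a negative slope means the keyword
# is climbing (its position number is shrinking over time).
def trend_slope(positions):
    n = len(positions)
    mean_x = (n - 1) / 2
    mean_y = sum(positions) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(positions))
    var = sum((x - mean_x) ** 2 for x in range(n))
    return cov / var

daily_positions = [18, 17, 17, 15, 14, 14, 12, 11, 11, 9]
slope = trend_slope(daily_positions)
print(f"Average change: {slope:.2f} positions per day "
      f"({'climbing' if slope < 0 else 'slipping'})")
```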

This is a solid, reasonable question that unfortunately requires a sliding-scale answer. Since your Google ranking is made up of 200 factors and Domain Pop is just one of them, no exact number can be used to determine your rank. First, what's the quality of the linking domains? One hundred no-name domains that you just created aren't worth one link from huffingtonpost.com.

I would suggest there is a rough rule of thumb…

DA 10 = fewer than 50 referring domains

DA 20 = 200

DA 30 = 400

DA 40 = 800

DA 50 = 2,000

These are not exact numbers, just rules of thumb that I've found helpful. Roughly, for every ten more points of DA the number of referring domains may double. PBN sites and domains with little to no traffic are maybe worth only a fourth of a real domain, so if you have 800 PBN sites linking to you, I'd say you may expect a DA of about 10 to 20. If those 800 are solid, real domains, your DA should be around 35 to 40.
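Purely to illustrate the pattern above (not an official formula from Moz or anyone else), here is the same rule of thumb as a tiny calculator: about 200 decent referring domains anchors DA 20, the count roughly doubles for each further 10 points, and PBN or zero-traffic domains count for about a quarter each.

```python
# Illustration of the rule of thumb above: ~200 solid referring domains
# corresponds to roughly DA 20, the count doubles for every extra 10 DA
# points, and PBN / zero-traffic domains count as about 1/4 of a domain.
# These anchors are this article's rough figures, not an official formula.
import math

def estimate_da(real_domains: int, pbn_domains: int = 0) -> float:
    effective = real_domains + pbn_domains / 4
    if effective <= 0:
        return 0.0
    return max(0.0, 20 + 10 * math.log2(effective / 200))

print(round(estimate_da(real_domains=800)))    # about 40, matching the table
print(round(estimate_da(0, pbn_domains=800)))  # 800 PBNs ~ 200 effective -> about 20
```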


Let me show you how important site speed and the mobile experience are…

[Image: desktop vs. mobile search results]

Why is realtor.com not higher than zumper.com in the mobile search on the left? Consider these metrics:

Realtor.com: Domain Score 55, Trust Score 58, Alexa Rank 763, Registered 1996, Ads 3,900, Backlinks 57,000,000, Traffic Rank 108

Zumper.com: Domain Score 37, Trust Score 44, Alexa Rank 17,000, Registered 2004, Ads 0, Backlinks 1,700,000, Traffic Rank 2,830

In every metric realtor.com wins, so why is it below zumper.com in the mobile results?

Site Speed Test on GTmetrix

Realtor.com fails on speed:

[Image: GTmetrix page load report for realtor.com]

Zumper.com passes on speed:

[Image: GTmetrix page load report for zumper.com]

So in this example we clearly see a more popular site beaten by a less established one, and the single factor the smaller site did better on was speed. And we can't dismiss this with "well, it's only important on mobile." In case you missed it…

60% of searches are mobile
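If you would rather script this check than run GTmetrix by hand, Google's public PageSpeed Insights API can return a mobile performance score. This is only a sketch: the site URL is a placeholder, and sustained use generally requires an API key passed as the key parameter.

```python
# Sketch: ask Google's PageSpeed Insights API (v5) for a mobile score.
# The site URL is a placeholder; regular use needs an API key (`key` param).
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def mobile_performance_score(site_url: str):
    resp = requests.get(API, params={"url": site_url, "strategy": "mobile"}, timeout=60)
    resp.raise_for_status()
    score = (resp.json()
                 .get("lighthouseResult", {})
                 .get("categories", {})
                 .get("performance", {})
                 .get("score"))
    return None if score is None else round(score * 100)  # Lighthouse reports 0-1

print(mobile_performance_score("https://www.example.com/"))
```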

First off, honestly, take pretty much everything you know about duplicate content and forget it. You will have duplicate content, and that's alright; it's normal. The Associated Press is a highly syndicated news source. Do you think Google penalizes every news site that includes AP content?

Think I'm talking crazy? Look for information from Google on duplicate content and ignore info from anyone else. To save you a step, here's me duplicating content by copying from Google: https://support.google.com/webmasters/answer/66359?hl=en

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin. Examples of non-malicious duplicate content could include:

  • Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
  • Store items shown or linked via multiple distinct URLs
  • Printer-only versions of web pages

If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google. (This is called “canonicalization”.) More information about canonicalization.

However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.

Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a “regular” and “printer” version of each article, and neither of these is blocked with a noindex meta tag, we’ll choose one of them to list. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

There are some steps you can take to proactively address duplicate content issues, and ensure that visitors see the content you want them to.

  • Use 301s: If you’ve restructured your site, use 301 redirects (“RedirectPermanent”) in your .htaccess file to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)
  • Be consistent: Try to keep your internal linking consistent. For example, don’t link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.
  • Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We’re more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
  • Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
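Stepping outside the quoted advice for a moment: a quick way to sanity-check the "use 301s" and "be consistent" points on your own pages is to fetch each URL variant and see whether it redirects and which rel="canonical" the final page declares. A minimal sketch with placeholder URLs:

```python
# Sketch: check whether URL variants 301-redirect to one preferred URL
# and what rel="canonical" the final page declares. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

def check_variant(url: str) -> None:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else "no redirect"
    canonical = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
    print(url)
    print(f"  first hop: {first_hop}, final URL: {resp.url}")
    print(f"  canonical: {canonical['href'] if canonical else 'none declared'}")

for variant in ("http://www.example.com/page/",
                "http://www.example.com/page",
                "http://www.example.com/page/index.htm"):
    check_variant(variant)
```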

Negative SEO is a branch of SEO that largely came into existence with Google's Penguin update. For the first time, a search engine was prepared to lower your ranking based on what it perceived to be a site gaming the system. Not only did sites change the way they promoted themselves, but this created a very real opportunity to harm competitors' sites.

All you have to do is mimic spammy promotion of the target site to draw Google into penalizing it. This can include:

  • Building too many unnatural backlinks from low-quality sites, too quickly to have been organic
  • Using anchor text with adult themes or words associated with low-brow tactics
  • Associating the competitor or target site with link-building schemes, such as a PBN
  • Duplicating their content, thereby watering down the value of the original

Google On Negative SEO
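The flip side of all this is spotting such an attack on your own site early. As a hedged illustration only: most backlink tools can export a CSV of newly found links, and a short script can flag sudden spikes in link velocity or spammy anchor text. The column names, word list, and threshold below are invented for the example.

```python
# Sketch: flag a possible negative-SEO pattern in a backlink export.
# Assumes a hypothetical CSV with columns: date, referring_domain, anchor_text.
# The suspicious-word list and spike threshold are illustrative only.
import csv
from collections import Counter

SUSPICIOUS_WORDS = {"casino", "viagra", "payday", "porn"}

def scan_backlinks(path: str, spike_threshold: int = 50) -> None:
    new_links_per_day = Counter()
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            new_links_per_day[row["date"]] += 1
            anchor = row["anchor_text"].lower()
            if any(word in anchor for word in SUSPICIOUS_WORDS):
                flagged.append((row["referring_domain"], row["anchor_text"]))
    spikes = {day: n for day, n in new_links_per_day.items() if n > spike_threshold}
    print("Days with unusual link velocity:", spikes or "none")
    print("Suspicious anchors:", flagged or "none")

scan_backlinks("new_backlinks.csv")
```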

Volume, and because Google says it is. That's honestly the best answer. Now for the why…

  • More than half of the searches on Google are from mobile devices. If you only think in terms of searching from a desktop, you ignore the majority of searches. A mobile device usually displays content in a small, limited-width view, so desktop pages are not appropriate for a good mobile experience. Mobile devices also generally load pages more slowly, as they lack a physical connection to a network. Pages need to load fast and therefore need to be streamlined for the best experience, so page load speed is extremely important.
  • Mobile First is an initiative Google started in 2018. The name is self-explanatory. If you don't want to put mobile first, you'll ignore the greatest customer you have: Google.
  • Mobile index: to address the disparity between the mobile experience and the desktop experience, Google created a second index of results that's independent of the desktop results. You could be the number one result on desktop but not listed on mobile. Since more than half of searches are on mobile devices, you would lose over half the opportunities for reaching a visitor.

Ignore mobile devices at your own peril; your SEO efforts are doomed, in my humble opinion. I don't work on projects I don't think can succeed, so I'd bow out of a project that ignores mobile.
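One tiny piece of "mobile first" you can check from a script is whether a page even declares a responsive viewport meta tag. This is nowhere near a full mobile audit, just a quick smoke test; the URL is a placeholder.

```python
# Quick smoke test: does the page declare a responsive viewport meta tag?
# This is only one small mobile-friendliness signal, not a full audit.
import requests
from bs4 import BeautifulSoup

def has_viewport_meta(url: str) -> bool:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return soup.find("meta", attrs={"name": "viewport"}) is not None

print(has_viewport_meta("https://www.example.com/"))
```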

Here is an example: searching "rental homes in louisville", you'll notice that the results come in a different order on desktop and mobile.

[Image: desktop vs. mobile search results]

It appears that while realtor.com is considered better for desktop, zumper.com has a better mobile experience. It's likely better laid out and faster than realtor.com, so it beats that site on mobile. Who wins here? Zumper, since there are more searches on mobile than on desktop. You might think, what's the big difference, it's one position? The difference between 3rd and 4th, or 2nd and 3rd, is huge, and the difference in clicks between 1st and 2nd position is more than 50%. If this is a billion-dollar industry, we're talking about potentially hundreds of millions of dollars.

And it IS likely layout and speed. Notice in the desktop view that realtor.com has a higher trust score and a higher domain score. Realtor.com has a higher rank, advertises more, and was registered first; in almost every SEO metric realtor.com should be above zumper.com. BUT on mobile it is not.
