SEO Interview Tough Questions with Easy Answers

What is meant by Panda Update?

Google introduced the Panda update to prevent websites with poor-quality or thin content from ranking high in the SERPs, and it has also made Panda a part of Google’s core ranking signals. The current version of Panda is 4.2, and more updates may follow to keep the Google SERPs filled with quality content relevant to the user’s search query.

What is meant by Penguin Update?

The Google Penguin algorithm targets black-hat SEO techniques, in particular websites trying to build links through link schemes, PBNs (Private Blog Networks), buying links through link networks, and similar tactics aimed mainly at improving their rankings.

What is Hummingbird?

The Google Hummingbird algorithm is about the search context of users. Rather than matching just one word in a query, it looks at every word in the query together, so it intelligently understands the user’s intent and returns the right answers. This concept is called semantic search. The main objective of this algorithm is that pages matching the meaning of a search query do better, rather than pages matching just a few words.

What is the RankBrain algorithm?

RankBrain is a machine-learning, artificial-intelligence system that helps Google process search queries and provide better results to users. RankBrain is also an important factor in the ranking algorithm, along with links and content.

What strategies do you follow to make a website rank higher in SERPs?

This is a tricky question; the interviewer asks it to see how the interviewee handles the SEO side and improves a website’s performance.

Initially, I would check the website’s Domain Authority to gauge its competitive strength. That gives a good baseline understanding of the site and helps shape the strategies to follow.

To rank well in the search engines, I would start by looking at the following:

  • Keyword research
  • Tracking the right metrics, such as keyword rankings, CTR, website visitors, and conversions from SEO
  • Building quality inbound links
  • Researching competitors’ websites and the links pointing to them
  • Fixing duplicate content issues and broken links
  • Targeting the right keywords in titles, descriptions, H1 tags, and the body of the content
  • Improving loading speed
  • Building a social media presence
  • Adding alt attributes to images

How do you analyze the Competitor Sites?

  • First, find the right keywords to search for your competitors
  • Search with those keywords and list the top 10 competitors
  • Check their backlink profiles using tools like Moz, Ahrefs, Majestic, and SEMrush
  • Evaluate your competition’s presence on social media
  • Compare them with your own site to see where you stand

What is Data Highlighter?

Data Highlighter is a tool in Webmaster Tools that teaches Google the pattern of structured data on your website. You simply tag the data fields on your site with a mouse, and Google can then present that data more attractively in the search results.

You can use Data Highlighter to help Google understand the following types of data on your site:

  • Articles
  • Events
  • Local Businesses
  • Restaurants
  • Products
  • Software Applications
  • Movies
  • TV Episodes
  • Books

What is Schema? How do you use it on your site?

Adding Schema markup to your pages lets search engines easily understand what each element is about. For example, if you are running an event, you can explicitly tell search engines when it starts and when it ends, which standard HTML code alone cannot convey.

Schema definitions are available at schema.org, which covers many different types of Schemas.
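
A minimal sketch of Event markup in JSON-LD, placed in the page’s HTML; all names, dates, and addresses here are hypothetical placeholders:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example SEO Meetup",
  "startDate": "2025-05-01T18:00",
  "endDate": "2025-05-01T21:00",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "address": "123 Example Street, Example City"
  }
}
</script>

With this in place, search engines know exactly when the event starts and ends, which plain HTML alone does not express.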

Is the meta keywords tag important these days?

No. Search engines, Google in particular, don’t treat meta keywords as a ranking signal because too many people have spammed the tag. Google doesn’t use this information at all.

If anything, it only helps your competitors see which keywords you are trying to target.
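
For reference, the tag looks like this (the keywords are hypothetical):

<meta name="keywords" content="seo interview questions, seo answers">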

What is Fetch as Google in the webmaster tools?

Fetch as Google is a tool in Webmaster Tools that tests whether Google can crawl a web page; it is useful for debugging crawl issues on your site.

You simply enter the URL in the tool and see whether Googlebot can access your site, how it renders the page, and whether any page resources (such as images or scripts) are blocked from Googlebot.

When you take your site down for maintenance, what do you tell search engines?

We simply return HTTP status code 503 (Service Unavailable). This tells the spiders that the server is temporarily unable to handle requests due to overloading or maintenance.
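
A minimal sketch of what the response might look like; the optional Retry-After header hints when crawlers should come back (the 3600 seconds here is an arbitrary example):

HTTP/1.1 503 Service Unavailable
Retry-After: 3600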

Name some outstanding tools that you use in your day-to-day SEO Work

Screaming Frog (technical SEO); Ahrefs, Moz, and Majestic (backlink profiles and other link-building tactics); SEMrush (competitors’ organic keywords, paid-ads analysis, keyword ranking reports, and more); and Google AdWords Keyword Planner (keyword research).

What are the main KPIs you track to improve the customer experience on a website?

Acquisition (cost per acquisition, CTR, % of new visits), behavior (bounce rate, abandonment rate, average time on site), and outcomes (conversion rate).

What is robots.txt and what are its parameters in creating it?

Robots.txt is a plain text file placed in the root folder of a website. Its main purpose is to restrict robots from crawling certain pages (note that it controls crawling, not indexing). Its main directives are User-agent (which crawler the rules apply to), Disallow (paths that must not be crawled), Allow (exceptions to a Disallow rule), and Sitemap (the location of your XML sitemap).
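
A minimal sketch of a robots.txt file using those directives; the paths and sitemap URL are hypothetical:

User-agent: *
Disallow: /admin/
Allow: /admin/public-page.html
Sitemap: https://www.example.com/sitemap.xml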

How do you measure the success of the SEO of a website?

Measuring the success of an SEO campaign can vary greatly depending on the type of business you’re in and your objectives, but the main KPIs for any website’s SEO success are:

  • Domain Authority
  • Rankings 
  • Traffic
  • Conversions

What tools do you use to track organic traffic?

Google Webmaster Tools (now Google Search Console) and Google Analytics.

Can we integrate Google Analytics and Webmaster Tools if they are not on the same account?

Yes, but you need admin access to both accounts to link them.

What are your day-to-day SEO activities and how do you perform them?

My day-to-day activities in SEO are like this:

  • Keyword ranking report
  • Checking for site errors in GWT
  • Addressing technical issues
  • Keyword research for long-tail and LSI variations
  • Social media postings
  • Checking for new trends, news, and updates in the SEO industry
  • Competitive analysis
  • Competitor backlink profiling
  • Content promotion through articles, blogging, and guest blogging
  • Link building, such as broken-link building, checking competitors’ backlinks and acquiring the same ones, blog commenting, and other sources
  • Local SEO: citations, posting on classifieds, and keeping the NAP (Name, Address, Phone Number) consistent

How do you know that your website has been hit by Panda or Penguin, and how would you take action on it?

If there is a sudden drop in organic traffic, that is a clear sign to check our Webmaster Tools and see whether any manual webspam actions have been applied.

The easiest way to check whether you were hit by Panda or Penguin is by using tools like:

http://barracuda.digital/panguin-tool/, or http://feinternational.com/website-penalty-indicator/

What is local SEO and how do you rank a website locally?

Local SEO is all about optimizing your website for Local search results.

To rank locally, we need to:

  • Claim the Google Business page
  • Build local NAP citations
  • Optimize local on-page SEO factors (add your local city/region along with the keyword in your URL, title, description, H1 tag, image alt tags, and the body of the content)
  • Embed your store location using Google Maps on the landing page
  • Pursue local link building and citations

When do we use the Disavow tool?

When spammy backlinks point to our site, we use the Disavow tool to tell Google not to count them. This way, we ask Google to ignore specific URLs or domains as ranking signals when indexing our site.
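
The disavow file itself is a plain text upload; a minimal sketch with hypothetical domains and URLs:

# spammy directory links acquired by a previous agency
domain:spam.example.com
https://links.example.net/bad-page.html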

If you were given a website to analyze, how would you analyze it?

First things first, we run a site audit using tools like Screaming Frog, checking technical aspects such as meta titles and descriptions, H1 tags, user-friendly URL structure, internal linking, image alt tags, duplicate content issues, broken links, XML and HTML sitemaps, robots.txt, server errors, code errors, and others.

Secondly, we check the off-page SEO factors: backlink profiles (using tools like Moz, Ahrefs, Majestic, SEMrush, and others), Domain Authority, competitive analysis, and social media presence.

In how many ways can you tell Google not to index your website?

Using the meta robots tag in the HEAD section of an HTML page:

<meta name="robots" content="noindex, nofollow">

You can also block Googlebot in robots.txt, as below. Note, though, that robots.txt stops crawling rather than indexing, so a blocked page can still end up indexed if other sites link to it; the noindex tag is the reliable method.

User-agent: Googlebot
Disallow: /this-page-will-be-blocked.html
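
A third option, useful for non-HTML resources such as PDFs where you cannot add a meta tag, is the X-Robots-Tag HTTP response header (how you set it depends on your server configuration):

X-Robots-Tag: noindex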

What is a backlink? How do you analyze the quality of a backlink? 

A backlink is an incoming hyperlink from one web page to another. These links are important in determining the popularity of any website. The quality of a backlink can be judged by the linking domain/page authority, its relevancy, whether it is dofollow or nofollow, and the anchor text used to create the link.
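
For reference, a nofollow link looks like this (the URL and anchor text are hypothetical); without the rel attribute, a link is dofollow by default:

<a href="https://example.com/" rel="nofollow">example anchor text</a>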

What kind of SEO reports do you prepare to know the performance of a website?

  • Keyword Rankings Reports
  • Competitors’ Rankings Reports
  • Link-Building Reports
  • Traffic Overview report
  • Conversion/Sale Report

Explain some Best SEO practices.

  • Ongoing keyword research
  • Ongoing competitive analysis
  • Finding linking opportunities to build Domain Authority
  • Rewriting titles and descriptions to include the target keywords, and testing different versions
  • Curating the content with the right keywords
  • Keeping up with new trends and algorithm updates in the SEO industry, and updating the site accordingly

What is a Citation and NAP in terms of Local SEO?

A citation is a mention of your business name, address, and phone number on the web. The combination of these 3 pieces of information is often referred to as NAP for short. Citations should be exactly the same wherever they appear on the web, because Google uses these mentions when evaluating the online authority of your local business.
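
You can also mark up the NAP on your own site with LocalBusiness structured data; a minimal sketch with hypothetical business details:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Example City",
    "postalCode": "00000"
  },
  "telephone": "+1-555-0100"
}
</script>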

What is Anchor text diversity, and how do you measure it?

Anchor text diversity tells you how varied the anchor texts pointing to your website are. Over-optimized anchor text means getting every inbound link to your site with the same keyword, which tells Google that you’re over-optimizing a single keyword to rank higher in the SERPs.

You can measure anchor text diversity using tools like Moz, Majestic, Ahrefs, etc.

How will you protect your clients’ websites from being penalized or banned?

By avoiding any kind of over-optimization, black-hat SEO, and unnatural methods of creating backlinks; never copying content from other sources; and making sure there is no thin content on the website.

How do you diagnose why a website is not ranking?

  • Check whether the website is being indexed or not
  • The loading time of the website
  • Internal linking
  • Whether there are any authority backlinks to the website
  • Review the quality of the content
  • Over-optimization, such as overuse of the same anchor text
  • Backlinks from penalized/spammy websites, or the same type of link-building method used every time
  • Too many backlinks acquired in too little time

How do you optimize SEO for eCommerce websites?

  • Targeting the right keywords for the product pages
  • Looking for duplicate title or description tags across the site and rewriting them
  • Using breadcrumbs
  • Image alt tags
  • Competitor research
  • Incorporating customer reviews
  • Integrating social media on product pages
  • Optimizing page load speed
  • Adding rich snippets (see the sketch below)
  • Using search-friendly URLs
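
For the rich snippets point, product pages typically carry Product structured data; a minimal sketch with a hypothetical product and ratings:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Running Shoe",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>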

How do you optimize canonical tags?

Canonical URLs consolidate link and ranking signals for content available through multiple URL structures or via syndication.

By using the canonical link tag, we can prevent duplicate content issues by specifying the “canonical” or “preferred” version of a web page. This way, we tell Google to treat multiple versions of a URL as the same page.

The canonical link allows content on various products or services to be accessible under multiple URLs, subdomains, or even multiple websites, without harming your search rankings.
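
A minimal sketch of the tag, placed in the HEAD of every duplicate variant and pointing at a hypothetical preferred URL:

<link rel="canonical" href="https://www.example.com/preferred-page/">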

Name the SEO tools that you can’t work without.

Moz, Majestic, Screaming Frog, Ahrefs, SEMrush, Google Analytics, and Google Webmaster Tools.

I give you 1,999 URLs, and within 24 hours you need to assess which one of these is the most valuable link prospect. How would you achieve this?

Here the interviewer is just checking how good you are at analyzing things. It’s a simple answer: go to Open Site Explorer, enter the domain name in the search box, download all the backlinks to a spreadsheet, then sort by Domain Authority and Page Authority, followed by dofollow status. This report will give you the most valuable link prospects.

I have a website with over 5k pages and need to assess which of these have no title tags or wrong title tags; how can I go about doing this at scale?

Use a tool like Screaming Frog: put your domain name in the search box and wait a few minutes while it crawls all the data into the software. Then navigate to the page titles report to see which titles are missing and which ones are wrong. You can also export this report to Excel and do your analysis there easily.

If you see a sudden drop in the number of indexed pages, what could be the reasons?

It might mean that the website’s server is down or overloaded, or that Google is having trouble accessing the website’s content.

If you see an unusually high index volume for a website in Google webmaster tools, what could be the reason behind it?

A high number of indexed URLs could mean that the website has problems with canonical URLs, duplicate content issues, or automatically generated pages, or that the website has been hacked.

I hope these questions help you succeed in landing your new SEO job. Let me know if you have any advanced SEO interview questions, and I will definitely add them to this list.