SEO For Law Firms

 

The way products and services are marketed has changed with the rise of the internet. People are spending more time online than ever before, and mail orders have long since been replaced by quick, easy apps that let you order everything online.

 

When someone needs a specific service, they don't step out of their home and go hunting from building to building—they pull out their phone and find it on Google Maps.

The internet provides easy access to information and communication, so it was only a matter of time before the common practices in marketing shifted to adapt. Potential clients are more likely to hear about a product or service through Twitter or a paid Instagram Story. They're more likely to look through Yelp Reviews instead of calling up their friends for restaurant recommendations.

Of course, let's not forget what people always say: "Google It"

Not sure how much nutritional content there is in your new favorite cereal? Do you need to have someone clean your air conditioning unit? Should you sue someone for something they've done to you? Not everyone knows a nutritionist, technician, or lawyer that they can just call up and ask. What they do have is the internet, and they can ask whatever they want and get thousands (if not millions) of results.

Search Engine Optimization (SEO) was developed for this very reason. Ultimately, it is but a result of the changing technological landscape.

 

Your older clients may still find you the old-fashioned way, but the younger ones will do what their peers do: Google it. Shifting your law firm marketing toward this new practice doesn't just help you win clients—it helps you adapt.

In this guide, we're going to cover the basics of SEO for law firm marketing, including the following:

  • Why Search Results Are Important

  • How You Show Up in the Search Results

  • Managing Your Site Design and Content

SEO doesn't start and end with a great website and steady content production. Those are the basic requirements, but there is plenty more you can do to fine-tune your practice.

While getting started with SEO doesn't involve a steep learning curve, there are a few key tasks that take constant repetition to master. The key to SEO is remembering what makes your design and content more effective.

 

Why Are Search Results Important?

 

As mentioned before, "Googling it" has become part of everyday life. It's no longer just a phrase used in tech-savvy circles; it has bled into common day-to-day vocabulary.

The search engine itself has become a directory.

 

Do you need to know the exact date a celebrity was born? Do you need a list of movies you want to watch on Netflix? Are you looking for a service that’s similar to Netflix? Do you need a divorce? Do you want to know where to get studio equipment for music? Do you need some references for a research paper?

In a few seconds, you can access all of this information. Even if what you're looking for can't be satisfied in a single search, you can look around a little more and get what you need anyway.

Here’s an example:

Say you want to know how to effectively sterilize baby formula bottles at home. You type "how to sterilize baby bottles" into the search bar and get a list of how-tos. Most of the results will tell you to boil them, but they will also suggest a couple of things to either improve what you're doing or save time—for example, some results will suggest sterilizing tablets or steamers.

It’s not a need that’s satisfied by a quick answer. It’s not the same as finally finding out a celebrity’s birth date, but it gives you what you need by showing you what to buy or do.

So, yes—it's a new type of directory for goods and services, and it's not the same as a paid ad. Instead of flashing paid advertising and waiting for a few people to be drawn in, you're offering your service to the people most likely to need it.

 

Important Note: Why Google?

While it’s good to look at other search engines when you’re practicing SEO, it’s important to point out that most of the SEO learning materials out there were shaped around Google’s algorithm.

And why shouldn’t they be?

The phrase "Google It" already tells you how wide their user base is. More people are likely to get on Google than any other site—which means more people come across your pages when they rank high. The principle here is maximizing your chances by focusing on the search engine that's going to give you the most traffic.

 

How Do You Show Up in the Search Results?

 

A single search returns millions of results, so what you need to think about now is how people are going to find you. The higher you rank, the closer you are to the top of the first page. Fewer and fewer people click deeper into the search results, so you need to make sure more people can come across your content (and, subsequently, your website).

Search engines operate in three steps:

Crawling

Searching the internet for content, and inspecting the code and content of each URL they come across.

Indexing

Storing and organizing the content discovered during the crawling process. Once a page is in the index, it can be displayed as a result for relevant queries.

Ranking

Providing the content that best answers a searcher's query, with results ordered from most relevant to least relevant.

Once you post or publish content, it doesn’t immediately show up in the search results. These three basically cover what happens from when Google finds your site to getting it up there in the searches. It’s crucial that you know what happens every step of the way, and learn to adjust your content or design based on that.

 

Crawling

Googlebot begins by retrieving a few web pages, then follows the links on those pages to discover new URLs. By following this path of links, the crawler finds new content and adds it to its vast database of discovered URLs, which can later be retrieved when a searcher is looking for information that a given URL is a great match for.

 

If you already have a site, you may want to start by checking how many of your pages are included in the index. This will reveal whether Google is crawling and finding all of the pages you want it to—and none of the pages you don't.

The advanced search operator "site:yourlawfirm.com" is one way to scan your indexed pages. Go to Google and type "site:yourlawfirm.com" into the search box. This returns the pages from that site that Google has in its index.

 

For more reliable results, track and use the Index Coverage report in Google Search Console. If you don't already have a Google Search Console account, you can create one for free. You can use this tool to submit sitemaps for your website and monitor how many of the submitted pages have been added to Google's index, among other things.

There are a few potential explanations why you're not coming up in the search results:

  • Your domain is new and has not yet been crawled.

  • There are no links to your site from other websites.

  • The navigation on your website makes it difficult for a robot to crawl it effectively.

  • Crawler directives, a form of simple code on your site, are preventing search engines from indexing it.

  • Google has penalized your website for using spammy tactics.

Most people worry about ensuring that Googlebot can locate their essential pages, but it's easy to overlook the fact that there are definitely pages you don't want found. Old URLs with inadequate content, duplicate URLs, special promo code pages, test pages, and so on are examples.

 

What Is Robots.txt?

A robots.txt file lives in the root directory of your website and uses simple directives to tell search engines which parts of your site they should and shouldn't crawl, as well as how quickly they should crawl it.
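Here's a minimal sketch of what a robots.txt file might look like. The folder names below are hypothetical examples, not recommendations for your own site:

  # Applies to all crawlers
  User-agent: *
  # Don't crawl these (hypothetical) sections
  Disallow: /test-pages/
  Disallow: /duplicate-drafts/
  # Crawl-delay is honored by some engines (e.g., Bing) but ignored by Google
  Crawl-delay: 10

  # Point crawlers to your sitemap
  Sitemap: https://www.yourlawfirm.com/sitemap.xml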

Googlebot's approach to robots.txt files is as follows:

  • When Googlebot can't locate a robots.txt file for a website, it crawls it.

  • If a site's robots.txt file is found, Googlebot will normally follow its instructions and proceed to crawl the site.

  • If Googlebot finds an error when attempting to access a site's robots.txt file and is unable to decide whether or not one exists, the site will not be crawled.

 

Not all web robots follow robots.txt. People with bad intent build bots that don't obey this protocol, and some bad actors even use robots.txt files to figure out where you keep your private information.

 

While it might seem sensible to prevent crawlers from reaching confidential pages such as login and admin pages, listing the location of those URLs in a publicly accessible robots.txt file makes them easier for people with malicious intent to find. Rather than including these pages in your robots.txt file, you can noindex them and gate them behind a login form.
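To noindex a page, you can add a robots meta tag to the page's HTML head—a minimal sketch:

  <head>
    <!-- Tells compliant crawlers not to include this page in their index -->
    <meta name="robots" content="noindex">
  </head>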

As a search engine crawls your website, it may be able to locate certain pages or sections while others stay hidden for various reasons. It's important to ensure that search engines can find all of the material you want indexed, not just your homepage.

Common navigation errors that prevent crawlers from seeing your entire site include:

  • Your mobile navigation shows different results than your desktop navigation.

  • You might be using JavaScript-powered navigation (or any form of navigation where the menu items are not in the HTML). Although Google has gotten better at crawling and understanding JavaScript, it is still far from perfect. Putting something in the HTML is a more reliable way to ensure it is found, recognized, and indexed by Google.

  • Your navigation doesn't link to a primary page on your website.

 

Quick Tip!

"You also might want to use sitemaps. It's a list of your site's URLs that crawlers can use to find and index your content. Creating a file that meets Google's expectations and submitting it through Google Search Console is one of the simplest ways to ensure Google finds the most important sites. Although submitting a sitemap isn't a substitute for good site navigation, it can definitely assist crawlers find all of your important sites."

Indexing

After you've confirmed that your website has been crawled, the next step is to ensure that it can be indexed. A search engine's ability to discover and crawl your website does not guarantee that it will be stored in its index.

 

The pages that are found are saved in the index. When a crawler discovers a page, the search engine renders it much like a browser would, evaluating the page's contents as it does so. All of that information is stored in its index.

 

Pages Can Be Deleted From the Index

Some of the most common reasons for URL removal include:

  • The URL returns a "not found" or "server error."

  • A noindex meta tag was applied to the URL.

  • The URL was deleted from the index after being manually penalized for breaching the search engine's Webmaster Guidelines.

  • The URL has been blocked from crawling by requiring visitors to enter a password before they can access the page.

 

If you suspect that a page previously in Google's index has disappeared, you can use the URL Inspection tool to check its status, or use the "Request Indexing" function (formerly part of the Fetch as Google tool) to add individual URLs to the index. (As an added bonus, the tool's "render" feature lets you see whether there are any problems with Google's interpretation of your page.)

 

Ranking

 

Search engines use the information they get from crawling and indexing to determine the best possible matches for what people type into the search bar. This is how they go about ranking. The closer a result is to the top, the more relevant the engine believes it is to the query.

Search engines use algorithms to evaluate significance, which is a method or formula for retrieving and organizing stored data in meaningful ways. Over time, these algorithms have undergone several improvements in order to increase the consistency of search results.


How Google Decides Ranking

The machine learning component of Google's core algorithm is called RankBrain. Machine learning is a type of computer program that improves its predictions over time as it learns from new observations and training data. In other words, it's always learning, and as a result, search results should be improving all the time.

 

If RankBrain finds that a lower-ranking URL is delivering a better result to users than higher-ranking URLs, you can bet that the results will be adjusted, with the more relevant result going higher and the less relevant pages being demoted as a byproduct.

Since Google will continue to use RankBrain to promote the most relevant, helpful content, we must concentrate more than ever on fulfilling searchers' intent. If you offer the best possible information and experience for searchers who might land on your page, you've taken a major step toward performing well in a RankBrain environment.

Managing Your Site Design and Content

Now that you know how Google handles what you put out, it's time for some tips on the kind of content that Google will crawl, index, and rank in your favor.

Using the Right Keywords

If you've ever dabbled in SEO articles before, you'll find that "keywords" (along with "links", discussed later) are almost always mentioned. Think about it: people type specific words into the search bar, so it's only natural that you should know what they're looking for.

The power of relevant keywords lies in understanding your target audience and how they search for your content, services, or goods. Keyword research offers detailed search data that can help you answer questions such as:

  • What are they looking for?

  • How often are people looking for it?

 

You've probably already prepared a bunch of keywords that you'd like to rank for. You can use a keyword research tool to find the average monthly search volume and related keywords for them (there are plenty of online tools out there, and many are free!).

When you're researching keywords, you'll probably find that the search volume for those keywords varies a lot. Although it's important to target words that your audience is looking for, in some situations, targeting phrases with lower search volume might be more advantageous because they're less competitive.

Note that keyword popularity changes throughout the year, depending on a variety of factors—the holidays, the news, or the release of a controversial new product. It's worth revisiting your keyword research regularly and looking in different places each time.

Here are some ideas:

Competitor's keywords

Your competitor is likely vying for the same top rank as you. Try entering a popular search term you want to rank for, look at the top result, and see what they're doing that's making them rank high. You're likely to see some repeated keywords among the results on the first page, too, so look into those.

Seasonal keywords

Understanding seasonal patterns will help you plan your content strategy. If it's getting closer to the holidays, people might be looking for "Christmas Presents" or "Snow". Bathing suits won't be as popular in the winter as they are around the summer. While law firms don't normally offer seasonal services, you might want to add some seasonal keywords in your regular articles and blog posts.

Region-specific keywords

Using region-specific keywords is great for narrowing down your target market (and can easily be filtered in most keyword tools!). These can include region-specific vernacular used in a particular area. Even if you aren't aware of these terms yourself, online keyword tools can help you find them.

You can also try adding your firm's specific location, like your city, county, or state name. For example, try using "Los Angeles Employment Lawyer" or "Orange County Criminal Defense Attorney."

 

Keyword Tip #1:  Use Keywords in Alt Text

Alt text (alternative text) within images is a web accessibility principle used to describe images to visually impaired people. It's important to include alt text descriptions in your images so that visually impaired visitors can understand what they're looking at.

Search engine bots also crawl alt text to better interpret your images, which gives you the added advantage of providing search engines with better image context. Just make sure your alt descriptions read naturally to people, and avoid stuffing them with keywords for search engines.
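A quick sketch of alt text in practice (the file name and wording are hypothetical):

  <img src="family-law-consultation.jpg"
       alt="Attorney meeting with a client for a family law consultation">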

 

Keyword Tip #2: Use Your Anchor Text Wisely

Anchor text is the clickable text you use to link to other pages. It signals to search engines what the destination page is about. You've probably seen it in articles that point you to more detailed explanations or other related pages.

For example, you can use "Divorce Attorney" as the anchor text for a link that goes to an article or page discussing divorce or family law.
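In HTML, that example might look like this (the URL is a hypothetical path):

  <a href="https://www.yourlawfirm.com/practice-areas/divorce-attorney">Divorce Attorney</a>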

Do note: Be careful not to overdo it. When you have a lot of internal links with the same keyword-stuffed anchor text, search engines will think you're trying to manipulate the page's ranking. It's preferable to use natural rather than formulaic anchor text.

 

Using Links

As mentioned, certain words always make it into SEO learning materials. One was "keywords"; the other is "links". The goal of SEO is to point potential clients toward your goods and services—basically, leading them to a particular page that will get them to consider what you offer.

When we say "links," we may be referring to one of two things. Internal links are links on your own site that lead to other pages, while backlinks are links from other sites that point to yours. Links have always played a big role in SEO. Early on, search engines needed help determining which URLs were more trustworthy than others in order to decide how to rank search results, and counting the number of links pointing to each page helped them do this.

PageRank (a component of Google's core algorithm) is a link analysis algorithm that estimates a web page's importance by analyzing the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will attract.

 

Linking Tip #1: Link accessibility

Links that require a click to reveal (like items inside a navigation drop-down) are often hidden from search engine crawlers, so if those are the only links pointing to certain internal pages, you may have trouble getting those pages indexed. Instead, make sure important links are directly accessible on the page.

 

Linking Tip #2: Be Mindful of Link volume

Having too many internal links won't get you penalized on its own, but it does affect how Google finds and evaluates your pages. A link helps users navigate to other pages on the site, in addition to passing authority between pages. This is one of those cases where what's right for search engines is also best for searchers: a large number of links not only dilutes each link's authority, it can also be unhelpful and confusing.

 

Linking Tip #3: Redirection

Removing and renaming pages is standard practice, but if you do so, make sure to update any links that point to the old URL. At the very least, redirect the old URL to its new location; if possible, also update all internal links to that URL at the source so that users and crawlers don't have to pass through redirects to reach the destination page.

Inputting Your Title Tags

The title tag is an HTML element that defines a web page's title. Each page on your website should have a unique, descriptive title tag. What you type into the title tag will appear in search results, although in some cases Google may adjust how it is displayed.
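A minimal sketch of a title tag in a page's HTML head (the firm name and keyword here are hypothetical):

  <head>
    <title>Los Angeles Employment Lawyer | Your Law Firm</title>
  </head>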

 

What makes a title tag effective?

Use of Keywords

 

The use of your target keyword in the title helps both users and search engines determine the purpose of your page. Furthermore, the closer your keywords are to the front of the title tag, the more likely a user is to read them (and possibly click), and the more useful they are for ranking.

Length

In search results, search engines typically display the first 50–60 characters (about 512 pixels) of a title tag. If the title is longer than that limit, an ellipsis "..." appears where it was cut off. Sticking to 50–60 characters is safe, but readability should never be sacrificed for a strict character count—go longer if you can't get your title tag under 60 characters without hurting readability.

 

Adding Meta Descriptions

Meta descriptions, like title tags, are HTML elements that describe the contents of the page they're on.
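A sketch of what a meta description looks like in a page's HTML head (the wording is a hypothetical example, not copy to reuse):

  <head>
    <meta name="description"
          content="Our Los Angeles employment lawyers help employees with wrongful termination, discrimination, and wage claims. Schedule a free consultation.">
  </head>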

 

What makes A Meta Description Effective?

Successful meta descriptions share many of the same attributes as effective title tags. While meta descriptions are not a direct ranking factor, they are absolutely crucial for click-through rate.

Relevance

Meta descriptions must be strongly relevant to the topic of your page, so they should summarize your main idea in some way. Give the searcher enough detail to know they've found a page that can address their query, but not so much that they have no need to click through to your website.

Length

Search engines usually truncate meta descriptions to about 155 characters; aiming for roughly 150–300 characters is a good guideline. On some SERPs, you'll notice that Google gives certain pages' descriptions a lot more room—most commonly for pages that rank immediately below a featured snippet.

 

Creating Your URLs

URL stands for Uniform Resource Locator. URLs are the web addresses or locations of specific pieces of content. Search engines display URLs on the SERPs, just like title tags and meta descriptions, so URL naming and format can affect click-through rates.

 

URLs are not only used by searchers to decide which web pages to click on; they are also used by search engines to evaluate and rank pages.

Search engines need unique URLs for each page on your website in order to view them in search results, but simple URL structure and naming are also beneficial for people trying to figure out what a particular URL is about.

 

Users are also more likely to click on URLs that confirm and explain the details on a website and less likely to click on URLs that lead to confusion.

 

URL Tip #1: The Length of the URL

While a fully flat URL structure is not needed, several click-through rate studies show that when given the option between a lengthy URL and a shorter URL, searchers prefer the shorter URL. Excessively long URLs would be cut off with an ellipsis, like how title tags and meta descriptions are cut. Note that having a descriptive URL is just as crucial as having a short URL, so don't shorten your URL if it means losing its descriptiveness.

 

URL Tip #2: Keywords in URL

Be sure to include the term or phrase in the URL if your page is targeting a particular term or phrase. However, don't overdo it by attempting to cram too many keywords for SEO purposes.

 

It's also important to keep an eye out for keywords repeated across multiple subfolders. You might have naturally included a keyword in a page name, but if that page sits inside folders that are also optimized for the same keyword, the URL can start to look keyword-stuffed.
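A sketch of the difference, using hypothetical URLs:

  Descriptive and keyword-targeted:
  https://www.yourlawfirm.com/practice-areas/los-angeles-employment-lawyer

  Repetitive and keyword-stuffed:
  https://www.yourlawfirm.com/lawyer/employment-lawyer/los-angeles-employment-lawyer/employment-lawyer.html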

Your Website Should Be Easy to Use, But Not Boring

 

Have you ever clicked on a Google search result, been bombarded by a bad site design, and immediately gotten turned off? That's what a bad user experience looks like. If your site is glitchy, hard to read, or hard to navigate, searchers are that much more likely to click away from it.

 

If you make it hard for a user to find your law firm’s contact information amongst the sea of links or unreadable text, you might actually lose potential clients.

Below are tips you can use to improve the overall user experience on your website:

Prioritize Accessibility

You might want to add all the bells and whistles that make your website look attractive to more viewers, but all that aesthetic is trumped by accessibility. If it's hard to browse through the areas of law your firm specializes in, users might click off and search for the specific thing they're looking for. If it's hard to find the number and email they should be contacting for appointments, they'll leave.

If site animations and formatting become a distraction from the quality content you're trying to get people to read, then the pretty design won't have done you any good.

Smartphone Users Should Not Have to Struggle

Given that mobile accounts for well over half of all web traffic today, it's fair to assume that your website should be mobile-friendly and simple to navigate. If the user has to zoom in and out of the desktop version of the site (a rarity these days), then that's a terrible user experience. As mentioned above, you should make it as convenient as possible for the user; otherwise, they'll get turned off.

Google released an update to its algorithm in April 2015 that prioritized mobile-friendly pages over non-mobile-friendly pages. Here are a few things you can do to make it mobile-friendly:

Responsive Site Design

Responsive sites (designed to adapt to the screen size of whatever device visitors are using) are the best way to go. CSS can be used to make a web page "respond" to the size of the screen. This is advantageous because it eliminates the need for users to double-tap or pinch-and-zoom to read the material on your site. Unsure whether your web pages are mobile-friendly? You can run them through Google's mobile-friendly test.
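A minimal sketch of the two pieces that make a page responsive—the viewport meta tag and a CSS media query (the breakpoint and class name are hypothetical):

  <head>
    <!-- Tells mobile browsers to match the device width instead of showing a zoomed-out desktop view -->
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      /* Hypothetical example: make the sidebar full-width on narrow screens */
      @media (max-width: 600px) {
        .sidebar { width: 100%; float: none; }
      }
    </style>
  </head>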

AMP

Accelerated Mobile Pages, or AMP, is a technique for delivering content to mobile users at much faster speeds than non-AMP delivery. AMP is able to deliver content so quickly because it uses a special AMP version of HTML and JavaScript to deliver content from its cache servers (rather than the original site).

Mobile Priority on Indexing

Google began rolling out mobile-first indexing in 2018, and the distinction between mobile-friendliness and mobile-first has blurred as a result. With mobile-first indexing, Google crawls and indexes the mobile version of your web pages. Making the website mobile-friendly benefits users and search rankings, but mobile-first indexing happens independently of mobile-friendliness.

This has raised issues for websites that lack parity between their mobile and desktop versions—displaying different content, navigation, links, and so on. For example, a mobile site with different links can change the way Googlebot's mobile crawler moves through your site and distributes link equity to your other pages.

Condensing and Bundling Files

“Minify resource” is a common recommendation in page speed audits—but what does that mean? Minification reduces the size of a code file by eliminating line breaks and spaces, as well as abbreviating code variable names wherever possible.

Another word you'll hear in relation to improving page speed is "bundling." Bundling is the method of combining many files that use the same coding language into a single file. A group of JavaScript files, for example, may be combined into a single larger file to minimize the number of JavaScript files required by a browser.

You can speed up your website and reduce the amount of HTTP (file) requests by minifying and bundling the files required to construct it.
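For example, rather than loading several small script files separately, a page might reference one combined, minified file—a hypothetical sketch:

  <!-- Before: three separate HTTP requests -->
  <script src="/js/menu.js"></script>
  <script src="/js/contact-form.js"></script>
  <script src="/js/analytics.js"></script>

  <!-- After: one bundled, minified file -->
  <script src="/js/site.bundle.min.js"></script>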

 

You’re Just Getting Started

SEO and its experts have always made one thing clear: it's an ever-changing field, so you should always be ready to adapt. Google changes its algorithm as the years go by—but that's nothing to be afraid of or give up over. If there's one thing that isn't going to change, it's the constant use of search engines in our day-to-day lives.

 

There’s simply no way Google (or people “Googling It”, for that matter) is going to go away in the near future.

What you need to do is constantly practice, learn, and adapt. If this is one of the first guides you’ve ever read about SEO, maybe the buzzwords are intimidating you right now—but that’s okay. You can use this guide as your checklist (or make your own!) and keep doing it until it’s almost instinctive. Repetition makes it easier to remember, and the more you do it, the faster and more efficient you’ll be.

And if you can’t do it yourself? Find someone to help you. SEO becoming a widely-practiced form of marketing means that it’s normal for companies, individuals, and firms to hire experts or seek advice from them.


JC Serrano | Founder 

LawyerLeadMachine

bottom of page