We all saw it coming: Google has decided to shut down its social media app, Google+, over the course of the next year, officially pulling the plug in August 2019. The plans were announced alongside a revelation that a security flaw had potentially exposed the personal information of users (including name, age, and gender) from 2015 until the flaw was discovered in March 2018. According to Google, no malicious activity was detected during this time, but we will let you decide what to believe.
It’s all over the Internet: Google is making a big change that affects nearly every website it indexes. By October 2017, all sites with input fields, including contact forms and search fields, must have a security certificate, or Chrome will flag them as “Not Secure” and their standing in Google’s eyes may suffer. Though Google has been set on enforcing “certificate transparency” for quite some time, this marks a tangible change in the company’s approach to insecure websites.
And isn’t Google the only search engine that matters?
So let’s talk about why you need to switch to HTTPS by October. Read the rest »
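Since the rule keys on input fields, a site owner can get a rough sense of which pages are affected by scanning each page’s HTML for form inputs. Here is a minimal sketch using only Python’s standard library; the URLs and field names are illustrative, and this is not an official Google tool:

```python
from html.parser import HTMLParser

class InputFieldAuditor(HTMLParser):
    """Collects the form input fields found in a page's HTML."""

    def __init__(self):
        super().__init__()
        self.fields = []

    def handle_starttag(self, tag, attrs):
        # Chrome's rule covers pages with text inputs, such as
        # contact forms and search boxes.
        if tag in ("input", "textarea"):
            self.fields.append((tag, dict(attrs).get("name")))

def pages_needing_https(pages):
    """Return the URLs of pages that contain input fields."""
    flagged = []
    for url, html in pages.items():
        auditor = InputFieldAuditor()
        auditor.feed(html)
        if auditor.fields:
            flagged.append(url)
    return flagged

# Illustrative pages; a real audit would fetch live HTML over HTTP.
sample = {
    "http://example.com/contact": '<form><input name="email"></form>',
    "http://example.com/about": "<p>No forms here.</p>",
}
print(pages_needing_https(sample))  # → ['http://example.com/contact']
```

A real audit would fetch live pages over HTTP and would also check whether a valid certificate is already installed.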
The general answer, of course, is as many as possible. However, for many businesses, especially law firms, “as many reviews as possible” is too broad a goal and often leads to less-than-ideal results. Hence, many lawyers simply want to know how many reviews they need to rank better, or to outrank their competition.
Read the rest »
Recently, tech freaks and internet marketing bloggers alike have been tweeting about Google’s rumored big project, known as the Knowledge Vault. Some are touting it as the next evolutionary step in search engine power, while others are hailing it as a sign of an oncoming digital dystopia. If you’ve been out of the loop, you may be wondering just what the big deal is. To understand the Knowledge Vault and all of its implications, it is essential to know about the Knowledge Graph, the technology it may soon replace.
Launched by Google in May 2012, the Knowledge Graph is the company’s existing knowledge base that pulls information from a variety of community-curated sources, such as Freebase and Wikipedia. By drawing associations between the billions of items contained within its semantic network, it is able to produce “smart” search results that aim to inform users rather than just supply them with links. Read the rest »
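The “associations between items” idea behind such a knowledge base can be illustrated with a toy semantic network of subject-predicate-object triples. The entities and facts below are made-up examples, not data from the actual Knowledge Graph:

```python
# A toy knowledge base of (subject, predicate, object) triples; the
# facts and names are illustrative, not from the real Knowledge Graph.
TRIPLES = [
    ("Leonardo da Vinci", "painted", "Mona Lisa"),
    ("Mona Lisa", "housed_in", "Louvre"),
    ("Louvre", "located_in", "Paris"),
]

def associations(entity, triples=TRIPLES):
    """Return every fact that mentions the entity, in either position."""
    return [t for t in triples if entity in (t[0], t[2])]

def describe(entity):
    """Build a short 'smart answer' from the entity's direct associations."""
    return [f"{s} {p.replace('_', ' ')} {o}" for s, p, o in associations(entity)]

print(describe("Mona Lisa"))
# → ['Leonardo da Vinci painted Mona Lisa', 'Mona Lisa housed in Louvre']
```

Because each fact links two entities, a query about one item can surface related items several hops away, which is what lets the Knowledge Graph inform users rather than just supply links.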
When Google releases a new search engine algorithm update, it’s no surprise that the online community cringes at the prospect that their website’s search engine ranking may suddenly plummet.
But how does Google decide what to change about its algorithms in the first place? To fine-tune its searches, Google draws on the help of thousands of Quality Raters who are contracted to evaluate the quality of Google’s search results for user queries. For example, a quality rater will determine whether the results for a search on “Panda updates” not only match the query but also provide high-quality information on the subject.
So, how do quality raters evaluate Google’s search results? Under the latest release of the Quality Rating Guidelines, version 5.0, raters examine search results for thousands of user queries and rate the level of “expertise, authoritativeness, and trustworthiness” found on the resulting websites. (The new criteria are certainly a mouthful but fortunately shorten to the acronym E-A-T.)
Of course, E-A-T is in no way a new concept for SEOs, web content writers, and web designers. If you want your website to do well, no matter what update Google releases, it’s important to always focus on providing reliable information and building a strong reputation for professionalism, creativity, and skill.
How Can a Website Show Expertise?
In order to establish your website as a source of high-quality, expert information, whether you are a personal injury attorney or run a small business for a particular hobby, you should provide valuable insight and advice on specific topics. You do not have to be a credentialed expert by any means. The following types of information can help you establish professional or everyday expertise. Read the rest »
Ever wonder how your website is reviewed and ranked by search engines like Google? Search engines use bots to “crawl” your site and gather data, then plug that data into a search algorithm that highlights what is important and what is spam, resulting in rankings for the various key phrases used in your web pages.
However, these algorithms are continually updated, which means websites face ever-increasing scrutiny over whether they offer the relevant content Internet users are looking for.
Google’s most recent update to its search algorithm, Hummingbird, focuses on finding quality content that people are actually searching for, and does so using queries in the form of questions rather than simply picking out keywords here and there. Unlike previous Google updates, such as Penguin and Panda, the Hummingbird update is more of a complete overhaul of the algorithm instead of a simple upgrade. With such a large change, it is expected that 90% of all searches will be affected by Hummingbird.
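The crawl-gather-rank pipeline described above can be sketched in miniature. In this toy version, the pages a bot has already fetched are stood in by a plain dictionary, and “the algorithm” is nothing more than counting how often a query’s words appear on each page; real ranking signals are, of course, far more sophisticated:

```python
import re
from collections import defaultdict

def build_index(pages):
    """Map each word to the pages it appears on, with a count ("gather data")."""
    index = defaultdict(lambda: defaultdict(int))
    for url, text in pages.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word][url] += 1
    return index

def rank(index, query):
    """Score pages by how often the query's words appear, highest first."""
    scores = defaultdict(int)
    for word in re.findall(r"[a-z]+", query.lower()):
        for url, count in index[word].items():
            scores[url] += count
    return sorted(scores, key=scores.get, reverse=True)

# Stand-in for pages a bot has already crawled; a real crawler fetches these.
pages = {
    "site-a.example/panda": "Panda update targets thin content. Panda rewards quality.",
    "site-b.example/news": "A short note on the latest search update.",
}
index = build_index(pages)
print(rank(index, "Panda update"))
# → ['site-a.example/panda', 'site-b.example/news']
```

Updates like Hummingbird change the `rank` step rather than the crawl: instead of matching raw keywords, the engine tries to interpret the whole query as a question, which is why an overhaul there can touch the vast majority of searches.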
Online marketing companies and corporations all remember the confusion and losses that resulted from Google’s Penguin 1.0 update during April 2012. Some are still looking for new ways to market online in its wake. So the beginning of Penguin 2.0 has predictably caused many companies to worry about their rankings and the tactics they should use to avoid harming their marketing campaigns.
As in many uncertain times throughout history, long before the rise of the Internet, people are at risk of falling victim to those who use scare tactics that prey on misunderstanding. Today, law firms across the country looking to recover from the effects of Penguin 2.0 are being targeted by people and companies who tell them that their rankings have fallen and that they alone can help them recover.