What Is Google SafeSearch? Information for Parents

Turn On SafeSearch for Your Kids

What is Google SafeSearch?

From the Google SafeSearch website: “Whether you’re using Google Search at work, with children, or for yourself, SafeSearch can help you filter sexually explicit content from your results.”

So, Google SafeSearch is focused on filtering and/or blocking sexually explicit content, which leaves what qualifies as “sexually explicit” open to interpretation. The mechanisms for blocking content are blocking search terms (the words you type into the search box), webpages, websites, and images/videos.

Google SafeSearch attempts to address each of these levels of access to sexually explicit content.

How effective is it?

First, a couple of concepts:

Overblocking refers to the circumstance where content control software blocks search terms, websites, pages, or images that are not actually sexually explicit. It happens when the blocking logic is tuned so aggressively that it also catches innocent content. For example, a phrase or search term used to identify sexually explicit material may also have perfectly innocent uses; when the filter blocks that phrase, it blocks the innocent content along with the explicit content.
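To make this concrete, here is a minimal sketch in Python (not any real filter’s code; the blocked term and the queries are purely illustrative) of how naive substring matching overblocks, and how whole-word matching reduces the problem:

```python
BLOCKED_TERMS = ["sex"]  # hypothetical filter rule

def naive_block(query: str) -> bool:
    """Substring matching: simple, but prone to overblocking."""
    return any(term in query.lower() for term in BLOCKED_TERMS)

def word_block(query: str) -> bool:
    """Whole-word matching: catches fewer innocent queries."""
    return any(word in BLOCKED_TERMS for word in query.lower().split())

print(naive_block("Essex county history"))  # True  -- overblocked: nothing explicit here
print(word_block("Essex county history"))   # False -- the innocent query passes
```

Every tightening or loosening of rules like these shifts the balance between overblocking and the underblocking described next.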

Underblocking refers to the circumstance where content control software fails to block search terms, websites, pages, or images that are in fact sexually explicit. This most frequently happens in the window between the time publishers of sexually explicit material improve their techniques for bypassing the filters and the time the filters are updated to counter those techniques.

There is a constant battle being waged between the purveyors of sexually explicit content and the content filtering service providers. Distributors of sexually explicit content, like most internet-based businesses today, depend on traffic to generate revenue; filtering services are therefore in the business of denying those providers traffic, and with it revenue. The result is an arms race: sexual content providers work to understand and bypass the logic and algorithms behind the blocking software, while filtering service providers push constant updates so that the sexually explicit content continues to be blocked.

Sexually explicit content providers are constantly changing URLs in order to evade the URL blocking aspect of content filtering.
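A minimal sketch of why this works, assuming a simple exact-match blocklist (the domain names here are made up):

```python
BLOCKED_SITES = {"badwebsite.com"}  # a blocklist snapshot at some point in time

def is_blocked(host: str) -> bool:
    """Exact-match check against the blocklist."""
    host = host.lower()
    if host.startswith("www."):
        host = host[4:]
    return host in BLOCKED_SITES

print(is_blocked("www.badwebsite.com"))  # True  -- blocked
print(is_blocked("badwebsite2.com"))     # False -- the new URL slips through until the list is updated
```

The gap between the URL change and the blocklist update is exactly the underblocking window described above.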

One of the tools that providers like Google use is a manual reporting process for material that has evaded the filters. Google provides a link in its SafeSearch process for users to report sexually explicit content that has managed to slip through a crack in the blocking and filtering process. This helps Google investigate where the cracks in the process are and modify the process in the ongoing battle to block the content.

As a parent, I would not depend on any solution to reliably block offensive content 100% of the time. I use content filtering as one of the tools in my “controlling access” arsenal, but I also suggest monitoring internet activity and keeping the lines of communication open with our children.

How does it work?

Again from Google: “When SafeSearch is on, it helps filter out explicit content in Google’s search results for all your queries across images, videos, and websites. While SafeSearch isn’t 100% accurate, it’s designed to help block explicit results, like pornography, from your Google search results.”
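One concrete, visible piece of this is the URL parameter Google Search uses to carry the setting: appending safe=active to a search URL requests SafeSearch filtering for that query. A minimal sketch (the query string is just an example):

```python
from urllib.parse import urlencode

def safesearch_url(query: str) -> str:
    """Build a Google search URL that requests SafeSearch filtering
    via the safe=active URL parameter."""
    return "https://www.google.com/search?" + urlencode({"q": query, "safe": "active"})

print(safesearch_url("science fair projects"))
# https://www.google.com/search?q=science+fair+projects&safe=active
```

Because the preference travels with the request and the account settings, changing the browser, account, or device can change whether filtering applies, which is where the loopholes discussed below come in.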

Any content control technology has limits, no matter the provider. There are a few types of content blocking technology, and a given solution may use one, several, or all of these methods. At a basic level, the options are (a small illustrative sketch follows the list):

  1. Blocking search terms. This refers to blocking words that are commonly associated with sexually explicit content. For instance, the search term “porn” would logically be blocked in Google SafeSearch. If SafeSearch is enabled and a blocked term is used in a search, the user receives a message instead of the explicit results.
  2. Blocking URLs. If SafeSearch determines that an entire website should be blocked because of explicit content, then access to the URL (e.g., www.badwebsite.com) is blocked, and the user cannot access anything on that website. If the system determines that only certain pages on a website should be blocked (for example, certain articles on a news site that discuss sexually explicit topics), then only those specific pages are blocked.
  3. Tagging images. Software that analyzes images to identify sexually explicit content advances continually. Early processes relied on the percentage of skin-tone colors present in an image. The techniques have since become far more sophisticated, to the point where software can often classify images more consistently than humans can. The real question comes down to what the process designers decide constitutes a sexually explicit image.
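For the curious, here is a small, purely illustrative sketch of the three layers above. None of this is Google’s actual code: the blocklists, the page name, and the skin-tone rule are simplified stand-ins for far more sophisticated systems.

```python
from urllib.parse import urlparse

BLOCKED_TERMS = {"porn"}                            # hypothetical term blocklist
BLOCKED_SITES = {"badwebsite.com"}                  # whole sites to block
BLOCKED_PAGES = {"news-site.com/explicit-article"}  # individual pages to block

def is_query_blocked(query: str) -> bool:
    """Layer 1: refuse searches containing a blocked term."""
    return any(word in BLOCKED_TERMS for word in query.lower().split())

def is_url_blocked(url: str) -> bool:
    """Layer 2: block whole sites, or just specific pages on a site."""
    parsed = urlparse(url if "//" in url else "//" + url)
    host = parsed.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    page = host + parsed.path.rstrip("/")
    return host in BLOCKED_SITES or page in BLOCKED_PAGES

def skin_tone_fraction(pixels) -> float:
    """Layer 3 (early-era heuristic): fraction of pixels in a rough
    skin-tone range; a high fraction flags the image for review.
    `pixels` is a sequence of (r, g, b) tuples."""
    def looks_like_skin(r, g, b):
        return r > 95 and g > 40 and b > 20 and r > g and r > b
    flagged = sum(1 for (r, g, b) in pixels if looks_like_skin(r, g, b))
    return flagged / max(len(pixels), 1)

print(is_query_blocked("porn videos"))                   # True -- search refused
print(is_url_blocked("www.badwebsite.com/anything"))     # True -- whole site blocked
print(is_url_blocked("news-site.com/explicit-article"))  # True -- single page blocked
print(skin_tone_fraction([(210, 140, 110), (30, 30, 30)]))  # 0.5 -- half the pixels flagged
```

Modern image classifiers use machine-learned models rather than a pixel rule like this, but the overall structure is the same: score the content, then compare the score against a threshold chosen by the process designers.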

Are there loopholes?

There are always ways to get around technology, and Google SafeSearch is no exception. If someone wants to circumvent a system badly enough, they will figure out a way around it.

The most glaring loophole in SafeSearch is that it can simply be turned off by the user. For children, Google has decided that 13 (in the US; the age varies by country) is the age at which an individual can independently decide whether to be constrained by SafeSearch. At 13, without parental consent, a child can simply turn it off. Why age 13? Because that is the age at which individuals can register for a Google account without parental consent, which complies with the federal regulations set forth in COPPA (the Children’s Online Privacy Protection Act), passed by Congress in 1998. COPPA prohibits organizations from collecting data about children under the age of 13 without parental consent.

About the Author

Terry has been an entrepreneur in the IT industry for over 30 years. Go here to read his complete personal story, “Husband, father, Grandfather and IT Executive.” If you want to send Terry a quick message, visit the contact page here.

If you would like to receive weekly email updates about how to talk to, monitor, and control your child’s digital experience, please join our mailing list below.