
Gator Crossing

The Official HostGator Company Blog!



What Is Net Neutrality And Why Does It Matter?

Written by Brandi Bennett

Wednesday, May 7th, 2014

Net neutrality is, at its core, the premise that all online data should be treated equally.  In a nutshell, this means that information should flow freely, without discrimination, blocking, or throttling by ISPs (Internet Service Providers) or any governmental intervention: uncensored access, equal access, and unrestricted access for everyone.  As Senator Al Franken (D-Minn.) put it, “Net neutrality is the First Amendment issue of our time.”  The Internet was designed as an open medium of communication, in which all users are able to access all content without restriction (with the obvious exception of certain legal limits on certain types of content, which fall beyond the scope of this blog post).

Many argue that net neutrality no longer exists.  The FCC’s previous rules on the matter were recently struck down, but in light of the publicity that the “citizens of the internet” have brought to this issue (including protests), the FCC is taking steps to create new net neutrality rules and is ostensibly working to keep the public’s desires at heart (a first for the FCC, one could argue!). The FCC’s actions are not entirely altruistic; it is also concerned with the creation of monopolies and the like. But the fact of the matter is that net neutrality is not yet dead… and that means it’s not too late!

If net neutrality ceases, we could be looking at an internet bogged down by fees, where users must pay to access certain types of content. One in which the various streaming services available today, from Netflix to Amazon, would face additional tolls that would, of course, be passed on to the end users of their services.  This would affect not just streaming services, but all content.  Say you wanted access to news websites: you could be charged one fee, and another fee could be charged if you wanted to look at internet memes. The sky would be the limit if net neutrality dies out completely. So yes, pay attention to anything involving net neutrality, and remember, as we said back in 2011: “We here at HostGator support a free internet. An internet in which free information and unhindered distribution of said information is an unalienable human right.” We still stand by this statement, and we believe that you need to know what’s going on in the world of the internet today!

Image Source: Color Lines. (2014). Net Neutrality. [image online] Available at: http://colorlines.com/assets_c/2013/09/net_neutrality_081310-thumb-640xauto-629-thumb-640xauto-9121.gif [Accessed: 27 Mar 2014].

Heartbleed Bug

Written by Sean Valant

Thursday, April 10th, 2014

You may have heard by now of the “Heartbleed Bug.” Before we continue, we want to reassure you that if you are hosting on a HostGator shared or reseller server, your server has already been patched. For everyone else, HostGator customer or not, we have created the following tool to help you determine whether your site is presently vulnerable and what further action to take, if necessary: https://heartbleed.hostgator.com/


Now, what exactly is the Heartbleed Bug? Technically speaking, it is a serious vulnerability in the popular OpenSSL cryptographic software library. In layman’s terms, it gives the ever-present nefarious individuals the ability to intercept and decode encrypted data. The following quote comes from heartbleed.com:

“The Heartbleed bug allows anyone on the Internet to read the memory of the systems protected by the vulnerable versions of the OpenSSL software. This compromises the secret keys used to identify the service providers and to encrypt the traffic, the names and passwords of the users and the actual content. This allows attackers to eavesdrop on communications, steal data directly from the services and users and to impersonate services and users.”

The bug is so named because of a normal function between two computers sharing an encrypted connection across a network (such as the Internet). The “heartbeat” is simply a pulse, or packet of information, sent from one machine to the other to ensure the connection still exists. This functionality is what allows the exploit to occur: a third party simulates the heartbeat in such a way as to gain access to the memory of the receiving server.
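
To make the mechanics concrete, here is a minimal, hypothetical Python sketch of the flaw (illustrative only, not OpenSSL’s actual code): the vulnerable handler trusts the length claimed in the heartbeat request rather than checking it against the payload actually sent, so a short payload with an inflated length echoes back adjacent server memory.

```python
# Illustrative sketch of the Heartbleed flaw; not OpenSSL's real code.
# The server builds its reply from raw memory, trusting the length
# claimed by the client.

SERVER_MEMORY = b"...secret keys...session cookies...user passwords..."

def heartbeat_vulnerable(payload: bytes, claimed_length: int) -> bytes:
    # BUG: no check that claimed_length <= len(payload). The reply is
    # copied from a buffer where other data sits right next to the
    # payload, so an inflated length leaks that adjacent memory.
    buffer = payload + SERVER_MEMORY  # stand-in for adjacent heap data
    return buffer[:claimed_length]

def heartbeat_patched(payload: bytes, claimed_length: int) -> bytes:
    # The fix, in spirit: discard heartbeats whose claimed length
    # exceeds the payload that was actually received.
    if claimed_length > len(payload):
        raise ValueError("malformed heartbeat discarded")
    return payload[:claimed_length]

# The attacker sends 4 bytes but claims 64: the reply echoes "ping"
# plus up to 60 bytes of whatever sat beside it in server memory.
print(heartbeat_vulnerable(b"ping", 64))
```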

What this translates to is virtually unlimited, and untraceable, access to a myriad of private information, potentially including usernames, passwords, and even credit card information. The full extent of the situation is not presently known. What is known is that we should all consider our passwords compromised. As a result, you absolutely want to update the password for anything and everything you log into online. However, if you change your password for an account on a server that has not yet been patched, you should consider the new password compromised as well.

For full information regarding this situation, we recommend reading the associated Wikipedia article.

Google Transparency Report: Government Removal Requests Continue to Rise

Written by Taylor Hawes

Wednesday, January 22nd, 2014


Revelations surrounding government monitoring of heavily populated data streams and communications channels are on everyone’s mind, as such news introduces a sea change in how we perceive our privacy rights. The simple fact is, our browsing habits and published content are no longer as “free” as they were once understood to be.

But even beyond the limitation of our own expression, what’s most concerning about these developments is the new and unprecedented level of authority governments are attempting to exercise over the world’s most important communications medium. With Google’s report as a barometer, the situation is clear: the rights of Internet users are increasingly treated as obstacles to efforts designed to shape public perception of governments, both local and national.

 

Government Takedown Requests Increase

As mentioned, the issue with this new era of Internet monitoring lies not just in the limitation of our freedom of speech, but in the presumed authority of local and national governments to censor content on the web. What was once thought of as a public commons, where anyone could post anything provided it respected the boundaries of international law, has become a curious combination of created content and concerted takedowns.

A recent Google transparency report tells the tale better than any news narrative could. According to the company’s blog post on the data, Google received 3,486 takedown requests covering 24,737 pieces of content between January and June 2013, a 68% increase over the same period in 2012. Additional data shows the trend rising sharply, with approximately 2,000 requests in 2011, 2,500 in 2012, and nearly 4,000 in 2013.
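
As a quick sanity check on those figures (our own back-of-the-envelope arithmetic, not Google’s published data), a 68% year-over-year increase implies roughly 2,075 requests in the same period of 2012:

```python
# Back-of-the-envelope check of the cited figures; our arithmetic,
# not Google's published data.

requests_h1_2013 = 3486
yoy_increase = 0.68

implied_h1_2012 = requests_h1_2013 / (1 + yoy_increase)
print(round(implied_h1_2012))  # 2075 requests in January-June 2012
```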

 

A Troubling Trend

What’s more striking about this data than the sheer volume of requests is their nature. According to Google, takedown orders were most often connected to stories about local government dealings and content critical of local and national governments. More often than not, these requests fell under the category of “defamation”, while some were even filed as copyright claims.

The trend is startling. With more governments ordering more takedowns of critical content, the aim is clear: censorship for the preservation of public perception. Attempts to curtail public expression, particularly in the realm of government criticism, represent an unfortunate turn away from transparency and toward the limited exchange of productive, albeit challenging, conversation.

 

A Powerful Ally

Fortunately, while takedown requests continue to rise, Google’s established policy against censorship of the Internet represents a valuable ally in the protection of free speech online. The data cited earlier also includes a list of US and international takedown requests, and whether those requests were met with compliance. According to the data provided, the compliance rate for these requests has fallen dramatically since 2010, likely in recognition of the danger of Internet censorship.

While this practice of attempting to silence critical voices may not seem like a big deal for your business or personal blog, the implications are farther-reaching than you may realize. The power of the Internet lies in the free exchange of ideas, allowing for meaningful conversation that raises profound and important ideas to the top. This process is what breeds innovation, disrupts deleterious practices, and enriches society as a whole.

Fortunately, Google’s transparency report shows that those in favor of a free and unedited Internet have a powerful ally and a strong ideological foundation on their side. Government takedown requests continue to rise, but those wishing to preserve the core of what makes the Internet such a powerful tool are not going down without a fight.

Google Hummingbird 101: 5 Things You Need To Know

Written by Taylor Hawes

Monday, December 16th, 2013


Google has changed. The Internet has changed. And because Google is so ubiquitously used, changes to the search giant’s algorithms fundamentally affect how we approach content discovery. This change can be scary, but knowing how it works, what to expect, and how it affects you will make all the difference as these revisions hit your site. In this post, we’re outlining 5 things you need to know about Google Hummingbird.

 

1. The Search Query Has Changed

In the beginning, search engines indexed information using a rather primitive method of keyword indexing. These indexes did not understand human language; they simply represented an amalgamation of terms associated with locations, weighted by popularity and inbound links. To appease this format, those searching for information had to truncate full, intelligent sentences into keywords and phrases that rubbed the algorithm the right way. Doing so would yield results, but with limited success.
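
To see what that older model looked like, here is a deliberately simplified, hypothetical Python sketch of keyword indexing (real engines were far more elaborate): documents are matched purely on term overlap, so word order, phrasing, and meaning are ignored entirely.

```python
# Toy inverted index in the spirit of "primitive keyword indexing";
# real search engines were far more sophisticated.

from collections import defaultdict

docs = {
    1: "cheap canvas umbrellas for sale",
    2: "history of the umbrella in europe",
    3: "how to repair a waxed canvas umbrella",
}

# Map each term to the set of documents containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def keyword_search(query: str) -> list:
    # Score documents by how many query terms they contain; word
    # order and meaning play no role at all.
    scores = defaultdict(int)
    for term in query.split():
        for doc_id in index.get(term, set()):
            scores[doc_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

# Note that doc 3's singular "umbrella" never matches "umbrellas":
print(keyword_search("canvas umbrellas"))  # [1, 3] -- doc 3 matched only "canvas"
```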

Hummingbird throws that playbook out the window. Many of the old factors still exist, including keywords and PageRank, but these now contribute to a formula that weighs some 200 different factors when returning results. In doing so, the engine works to incorporate long-form queries and human speech patterns into judging the relevance and quality of search results. What this means for you: no longer will your pages be judged on primitive factors alone. Relevant, original, and interesting information is, for the first time, being surfaced and shuttled forth to interested eyes in dynamic new ways.
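
As a rough mental model, you can picture each result’s rank as a weighted blend of many signals rather than keyword overlap alone. The factor names and weights below are invented purely for illustration; Google does not publish its actual formula:

```python
# Hypothetical composite scoring; factor names and weights are
# invented for illustration, not Google's published formula.

signals = {
    "keyword_match": 0.72,   # classic term overlap
    "pagerank": 0.55,        # link-based authority
    "freshness": 0.30,       # how recent the content is
    "intent_match": 0.81,    # Hummingbird-style semantic relevance
}

weights = {
    "keyword_match": 0.25,
    "pagerank": 0.25,
    "freshness": 0.10,
    "intent_match": 0.40,
}

score = sum(weights[name] * value for name, value in signals.items())
print(f"composite relevance score: {score:.3f}")
```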

 

2. Blame Fluff For The Changes

The changes are not baseless; this isn’t simply revision for revision’s sake. Google’s efforts are born of an era of Internet content in which traditional methods could be exploited, placing unoriginal, uninteresting, and unengaging, though keyword-dense, content in front of curious viewers, to the detriment of their search efforts and of the reputation of websites offering compelling work.

 

3. Hummingbird Works in a Series

In this fact lies, perhaps, the greatest change to Google’s underlying engine. Previously, queries were submitted and results returned based on a number of factors, but each query represented a fresh start, effectively limiting the ability to drill down when further insight was sought. The Hummingbird engine takes a new approach to search, incorporating human behavior as a central tenet.

Continued searches are now viewed with a combination of order and context based on previous searches. If this sounds confusing, here’s a breakdown: each search in a series is understood by the engine in a different way. Initial queries are viewed as browsing, offering surface information and broad responses. A follow-up search related to the topic reveals more in-depth information. This series continues, retrieving information to a greater degree of specificity based on the search order and length of specific queries. In doing so, the engine emulates the human research process, seeking broad concepts and then working down to the details, in order to facilitate knowledge acquisition.

For commercial firms, this procedural search opens new doors for information previously buried in the hierarchies of corporate websites. Until recently, pages needed carefully crafted keywords to mark them as a more robust and authoritative resource. The series approach now cuts the guesswork out of the process. Someone searching for “umbrellas” will receive a ranked list of vendors. A further search for “canvas umbrellas” will surface product pages and information matching that description, reflecting the greater refinement of the request. Another search for “waxed canvas umbrellas for under $100” will narrow the product recommendations and the information provided, on the understanding that, at this point in your journey, you are likely ready to buy a specific product. Beyond this step in the funnel lies information for existing customers: tech specifications, how-to instructions, and maintenance references, to name a few.

 

4. Original, Informative Content is the Future

This series of steps and refinement of keyword comprehension means one thing: original, engaging content is the future. No longer are rote, keyword-dense answers aimed at attracting site traffic the ringleaders. In particular, Hummingbird favors authoritative, information-rich sources that leverage Google Plus authorship and publishership to tailor results to a fatigued and discerning public.

Since the engine is built on the promise of delivering answers to questions, this, above all else, should drive your future content efforts. Offer FAQ pages, Q&A blog content, how-to posts, and interviews that focus on questions and answers to assert your authority in a particular area. Offer industry debates and “ask the expert” posts to establish your firm as one that offers valuable information. In all things, remember that users are asking questions. Your job is to have the answers.

 

5. SEO is Evolving

In this way, SEO isn’t disappearing but evolving. As mentioned, Google’s revision comes largely at the behest of users who want to find more relevant content, tired of disappointing front-page entries that simply “played the game”. Traditional methods of link-mining, keyword stuffing, and cheap, overly sensationalist titles will receive less reward than ever before.

In place of these methods comes a combination of traditional keywords and long-tail keywords. When embedding information in your page, your prior expertise in researching relevant keywords will still play a part, but stuffing the box will not. Simply focus on integral terms that hone your page down to its proffered expertise and value. In addition to these single-word keywords, incorporate longer terms that effectively answer questions. In particular, observe the algorithm’s treatment of single keywords as indicative of broad information, 2-3-word keywords as more in-depth research and learning, 3-4-word keywords as detailed information, and 4+-word keywords as specialist information for customers and experts, as the sketch below illustrates.
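
Here is a toy Python sketch of that word-count heuristic. The buckets and labels are our own illustration, not Google’s published behavior, and since the ranges above overlap at three words, the sketch picks one consistent partition:

```python
# Toy illustration of the word-count heuristic described above; the
# buckets and labels are our own, not Google's published behavior.

def classify_query(query: str) -> str:
    """Map a query's word count to a rough specificity tier."""
    n = len(query.split())
    if n <= 1:
        return "broad information"
    elif n <= 3:
        return "research and learning"
    elif n == 4:
        return "detailed information"
    else:
        return "specialist / ready-to-buy"

for q in ("umbrellas",
          "canvas umbrellas",
          "waxed canvas umbrellas",
          "waxed canvas umbrellas under $100"):
    print(f"{q!r} -> {classify_query(q)}")
```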

Hummingbird’s changes are unlikely to lose you traffic, but the science behind search engines has changed profoundly, and adaptation is necessary. Gone are the days of gaming the system; in their place is an era of authority and originality. Series of queries will yield more robust results, as unearthing helpful content and answers is the goal. Optimize your site for the new format by including single-word terms and longer, more robust keywords in tandem. The combination may hurt impostors, but as a genuine vendor of valuable information, consider a ticker-tape parade and a bottle of Champagne.
