Gator Crossing

The Official HostGator Company Blog!

Web Hosting News

Heartbleed Bug

Written by Sean Valant

Thursday, April 10th, 2014

You may have heard by now of the “Heartbleed Bug.” Before we continue, we want to reassure you that if you are hosting on a HostGator shared or reseller server, your server has already been patched. For everyone else, HostGator customer or not, we have created the following tool to help you determine whether your site is presently vulnerable and what further action to take, if necessary: https://heartbleed.hostgator.com/

Now, what exactly is the Heartbleed Bug? Technically speaking, it is a serious vulnerability in the popular OpenSSL cryptographic software library. In layman’s terms, it gives ever-present nefarious individuals the ability to intercept and decode encrypted data. The following quote comes from heartbleed.com:

“The Heartbleed bug allows anyone on the Internet to read the memory of the systems protected by the vulnerable versions of the OpenSSL software. This compromises the secret keys used to identify the service providers and to encrypt the traffic, the names and passwords of the users and the actual content. This allows attackers to eavesdrop on communications, steal data directly from the services and users and to impersonate services and users.”

The bug is so named because of a normal function between two computers sharing an encrypted connection across a network (such as the Internet). The “heartbeat” is simply a pulse, or packet of information, sent from one machine to the other to ensure the connection still exists. This functionality is what allows the exploit to occur: a third party crafts a malicious heartbeat in such a way that the receiving server responds with the contents of its memory.
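
To make the flaw concrete, here is a minimal sketch in Python (purely illustrative, not OpenSSL’s actual code) of why trusting the sender’s claimed payload length leaks adjacent memory, and what the patched behavior looks like:

```python
# Toy model of the Heartbleed flaw -- illustrative only, not OpenSSL's code.
# The received payload sits in a buffer next to other data the server happens
# to hold in memory; the reply is built from that buffer.

OTHER_RESIDENT_DATA = b"user=alice&password=hunter2&cc=4111111111111111"

def heartbeat_reply_unpatched(payload: bytes, claimed_length: int) -> bytes:
    buffer = payload + OTHER_RESIDENT_DATA  # payload stored beside secrets
    # Bug: trusts the sender's claimed length instead of len(payload),
    # so an oversized claim reads past the payload into adjacent memory.
    return buffer[:claimed_length]

def heartbeat_reply_patched(payload: bytes, claimed_length: int) -> bytes:
    # Fix (in spirit): silently discard heartbeats whose claimed length
    # exceeds the payload actually received.
    if claimed_length > len(payload):
        return b""
    return payload[:claimed_length]

print(heartbeat_reply_unpatched(b"ping", 40))  # leaks bytes beyond the 4-byte payload
print(heartbeat_reply_patched(b"ping", 40))    # b'' -- malformed request dropped
```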

What this translates to is virtually unlimited, and untraceable, access to a myriad of private information, potentially including usernames, passwords, and even credit card information. The full extent of the situation is not presently known. What is known is that we should all consider all of our passwords to be compromised. As a result, you absolutely want to update your passwords for anything and everything you log into online. However, if you change your password for an account on a server that has not been patched, then you can consider the new password compromised as well.
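
If you administer your own server (VPS or dedicated), one quick, low-effort check is to see which OpenSSL build your system’s Python is linked against; builds from 1.0.1 through 1.0.1f were affected, while 1.0.1g and the older 0.9.8/1.0.0 branches were not. Note that the library Python reports may differ from the one your web server actually uses, so treat this as a hint, not a substitute for the scanner linked above:

```python
# Print the OpenSSL version this Python interpreter is linked against.
# Affected range: 1.0.1 through 1.0.1f; 1.0.1g and later are patched.
import ssl

print(ssl.OPENSSL_VERSION)       # e.g. "OpenSSL 1.0.1g 7 Apr 2014"
print(ssl.OPENSSL_VERSION_INFO)  # numeric version tuple
```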

For full information regarding this situation, we recommend reading the associated Wikipedia article.

Google Transparency Report: Government Removal Requests Continue to Rise

Written by Taylor Hawes

Wednesday, January 22nd, 2014

Revelations surrounding government monitoring of heavily populated data streams and communications channels are on everyone’s mind as such news introduces a sea change of perception about our privacy rights. The simple fact is, our browsing habits and published content are no longer as “free” as they were once understood to be.

But even beyond the limitation of our own expression, what’s most concerning about these developments is the new and unprecedented level of authority governments are attempting to exercise over the world’s most important communications medium. With Google’s report as an accurate barometer, the situation is clear: the rights of Internet users are increasingly viewed as being at odds with efforts designed to shape public perception of governments, both local and international.


Government Takedown Requests Increase

As mentioned, the issue with this new era of Internet monitoring lies not just in the limitation of our freedom of speech, but in the presumed authority that local and national governments have to censor content on the web. What was once thought to be a public domain, where anyone could post anything provided it respected the boundaries of international law, has become a curious combination of created content and concerted takedowns.

A recent Google transparency report tells the tale better than any news narrative could. According to the company’s blog post on the data, Google received 3,486 takedown requests regarding 24,737 pieces of content between January and June 2013, a 68% increase over the same period in 2012. Additional data provided shows the trend rising sharply, with approximately 2,000 requests in 2011, 2,500 in 2012, and nearly 4,000 in 2013.
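
As a quick sanity check on those figures (assuming the 68% increase compares January–June 2013 against January–June 2012), the implied baseline works out to roughly 2,100 requests for the first half of 2012:

```python
# Back-of-the-envelope check: what baseline does a 68% increase imply?
requests_h1_2013 = 3486
implied_h1_2012 = requests_h1_2013 / 1.68
print(round(implied_h1_2012))  # ~2075 requests in January-June 2012
```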


A Troubling Trend

More striking than the sheer volume of requests is the nature of those requests. According to Google, takedown orders were most often connected to stories about local government dealings and content critical of local and national governments. More often than not, these requests fell under the category of “defamation,” while some were even framed as copyright claims.

The trend is startling. With more governments ordering more takedowns of critical content, the aim is clear: censorship for the preservation of public perception. Attempts to curtail public expression, particularly in the realm of government criticism, represent an unfortunate turn away from transparency and toward limiting the exchange of productive, albeit challenging, conversation.


A Powerful Ally

Fortunately, while takedown requests continue to rise, Google’s established policy against censorship of the Internet represents a valuable ally in the protection of free speech online. The data cited earlier also features a list of US and international takedown requests, and whether or not those requests were met with compliance. According to data provided, the compliance rate for these requests has fallen dramatically from 2010 to now, likely in recognition of the danger of Internet censorship.

While this practice of attempting to silence critical voices may not seem like a big deal for your business or personal blog, the implications are farther-reaching than you may realize. The power of the Internet lies in the free exchange of ideas, allowing for meaningful conversation that raises profound and important ideas to the top. This process is what breeds innovation, disrupts deleterious practices, and enriches society as a whole.

Fortunately, Google’s transparency report shows that those in favor of a free and unedited Internet have a powerful ally and a strong ideological foundation on their side. Government takedown requests continue to rise, but those wishing to preserve the core of what makes the Internet such a powerful tool are not going down without a fight.

Google Hummingbird 101: 5 Things You Need To Know

Written by Taylor Hawes

Monday, December 16th, 2013

Google has changed. The Internet has changed. And because Google is so ubiquitous, changes to the search giant’s algorithms fundamentally affect how we shape our efforts at content discovery. This change can be scary, but knowing how it works, what to expect, and how it affects you will make all the difference as these revisions hit your site. In this post, we’re outlining 5 things you need to know about Google Hummingbird.


1. The Search Query Has Changed

In the beginning, search engines indexed information based on a rather primitive method of keyword indexing. These indexes did not understand human language; they simply represented an amalgamation of terms associated with locations, weighted by popularity and inbound links. In order to appease this format, those searching for information were required to truncate full, intelligent sentences into keywords and phrases that rubbed the algorithm the right way. Doing so would yield results, but with limited success.
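
As a rough illustration of that older model, here is a minimal sketch of a keyword-only index (the pages and popularity scores are made up, and this is nothing like Google’s real implementation): it matches terms, not meaning.

```python
# Minimal keyword-only index: each term maps to the pages containing it,
# and ranking is a crude sum of per-page "popularity" scores for matched terms.
# Pages, text, and scores are hypothetical.
from collections import defaultdict

pages = {
    "example.com/umbrellas": ("canvas umbrellas and rain gear", 0.9),
    "example.com/raincoats": ("raincoats and other rain gear", 0.6),
    "example.com/blog":      ("why we love walking in the rain", 0.3),
}

index = defaultdict(set)
for url, (text, _popularity) in pages.items():
    for term in text.split():
        index[term].add(url)

def search(query: str):
    scores = defaultdict(float)
    for term in query.split():
        for url in index.get(term, set()):
            scores[url] += pages[url][1]  # weight hits by popularity only
    return sorted(scores, key=scores.get, reverse=True)

print(search("canvas rain umbrellas"))  # matches exact terms, not intent
```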

Hummingbird throws that playbook out the window. Many of the old factors still exist, including keywords and PageRank, but these contribute to a formula that accounts for 200 different factors when returning results. In doing so, the engine works to incorporate long-form queries and human speech patterns to influence the relevance and quality of search results. What this means for you: no longer will your pages be judged simply on primitive factors. Relevant, original, and interesting information is, for the first time, being revealed and shuttled forth to interested eyes in dynamic new ways.


2. Blame Fluff For The Changes

The changes are not baseless; this isn’t simply revision for revision’s sake. Google’s efforts are born of an era of Internet content in which traditional methods could be exploited, placing unoriginal, uninteresting, and unengaging, though keyword-dense, content in front of curious viewers, to the detriment of their search efforts and of the reputation of websites offering compelling work.


3. Hummingbird Works in a Series

In this fact lies, perhaps, the greatest change to Google’s underlying engine. Previously, queries were submitted and results were returned based on a number of factors. However, each query represented a new effort, effectively limiting the ability to drill down into information when further insight was sought. The Hummingbird engine takes a new approach to the process of search, incorporating human behavior as a central tenet.

Continued searches are now viewed with a combination of order and context based on previous searches. If this sounds confusing, here’s a breakdown: each search in a series is understood by the engine in a different way. Initial queries are viewed as browsing, offering surface information and broad responses. A follow-up search related to the topic reveals more in-depth information. This series continues, retrieving information to a greater degree of specificity based on the search order and length of specific queries. In doing so, the engine emulates the human research process, seeking broad concepts and then working down to the details, in order to facilitate knowledge acquisition.

For commercial firms, this procedural search opens new doors for information previously buried in the hierarchies of corporate websites. Until recently, pages needed carefully crafted keywords to delineate their use as a more robust and authoritative resource. However, the series now cuts the guesswork out of the process. Those searching for “umbrellas” will receive a ranked list of several firms. A further search of “canvas umbrellas” will offer product pages and information matching the description, understanding the greater refinement of the request. Another search for “waxed canvas umbrellas for under $100” will narrow product recommendations and the information provided, comprehending that, at this point in your journey, you are likely ready to buy a specific product. Beyond this step in the funnel lies information for present customers involving tech specifications, how-to instructions, and maintenance references, just to name a few.
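
Here is a toy sketch of that drill-down behavior (hypothetical catalog, not Google’s actual engine): each follow-up query filters the results the previous one produced, so the session narrows from browsing to a near-purchase decision.

```python
# Toy illustration of "search as a series": each query refines the previous
# result set rather than starting from scratch. Catalog and prices are made up.
catalog = [
    {"name": "Classic canvas umbrella", "material": "canvas",       "price": 45},
    {"name": "Waxed canvas umbrella",   "material": "waxed canvas", "price": 89},
    {"name": "Nylon travel umbrella",   "material": "nylon",        "price": 25},
]

def refine(results, query, max_price=None):
    terms = query.lower().split()
    kept = [item for item in results
            if all(t in f"{item['name']} {item['material']}".lower() for t in terms)]
    if max_price is not None:
        kept = [item for item in kept if item["price"] <= max_price]
    return kept

session = refine(catalog, "umbrella")                              # broad browsing
session = refine(session, "canvas umbrella")                       # narrower category
session = refine(session, "waxed canvas umbrella", max_price=100)  # ready to buy
print([item["name"] for item in session])                          # ['Waxed canvas umbrella']
```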


4. Original, Informative Content is the Future

This series of steps and refinement of keyword comprehension means one thing: original, engaging content is the future. No longer do rote, keyword-dense answers aimed at currying site traffic lead the pack. In particular, Hummingbird favors authoritative, information-rich sources that piggyback on Google+ authorship and publisher markup to tailor results to a fatigued and discerning public.

Since the engine is based on the promise of delivering answers to questions, this, above all else, should drive future content efforts. Offer FAQ pages, Q&A blog content, how-to posts, and interviews that focus on questions and answers to assert your authority in a particular area. Offer industry debates and “ask the expert” posts to position your firm as one that offers valuable information and drive traffic accordingly. In all things, remember that users are asking questions. Your job is to have the answers.


5. SEO is Evolving

In this way, SEO isn’t disappearing, but, instead, evolving. As mentioned, Google’s revision comes largely at the behest of users desiring more relevant content, tired of disappointing front-page entries that simply “played the game.” Traditional methods of link-mining, keyword stuffing, and cheap, overly sensationalist titles will receive less reward than ever before.

In place of these methods is a combination of traditional keywords and long-tail keywords. When embedding information in your page, your prior expertise in researching relevant keywords will still play a part, but keyword stuffing will not. Simply focus on integral terms that hone your page down to its proffered expertise and value. In addition to these one-word keywords, incorporate longer terms that effectively answer questions. In particular, observe the algorithm’s treatment of single keywords as indicative of broad information, 2-3 word keywords as more in-depth research and learning, 3-4 word keywords as detailed information, and 4+ word keywords as specialist information for customers and experts.
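
A small sketch of that keyword-length heuristic (the tier labels mirror the paragraph above; the exact cutoffs are one reading of the overlapping 1 / 2-3 / 3-4 / 4+ bands, not anything Google publishes):

```python
# Classify a query by word count into the rough intent tiers described above.
def query_tier(query: str) -> str:
    words = len(query.split())
    if words <= 1:
        return "broad information"
    if words <= 3:
        return "research and learning"
    if words == 4:
        return "detailed information"
    return "specialist / ready-to-buy"

for q in ["umbrellas",
          "canvas umbrellas",
          "waxed canvas umbrellas nearby",
          "waxed canvas umbrellas for under 100 dollars"]:
    print(f"{q!r}: {query_tier(q)}")
```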

Hummingbird’s changes are unlikely to lose you traffic, but the science behind search engines has changed profoundly, necessitating adaptation. Gone are the days of gaming the system; here is an era of authority and originality. Series of queries will yield more robust results, as unearthing helpful content and answers is the goal. Optimize your site for the new format by including single-word terms and longer, more robust keywords in tandem. The combination may hurt impostors, but as a genuine vendor of valuable information, consider a ticker-tape parade and a bottle of Champagne.

NSA Taps Into Google, Yahoo Data

Written by Taylor Hawes

Wednesday, November 6th, 2013

In May of 2013, former National Security Agency contractor Edward Snowden fled the United States with classified documentation revealing some of the most sophisticated and prolific public spying in American history. The PRISM program he divulged is an extensive campaign that utilizes classified intelligence directives to acquire “metadata” from major Internet players like Google and Yahoo. Since then, Snowden has brought to light myriad programs of a similar ilk, geared toward data collection in the name of intelligence efforts.

In a recent leak, however, it was revealed that PRISM’s scope pales in comparison to the NSA’s international data mining project, known by the codename MUSCULAR and run in tandem with the British GCHQ. The program, it was shown, utilizes the linkages between Google and Yahoo data centers, mining entire data flows and shipping the data back to NSA data warehouses in Fort Meade.

The NSA program utilizes a structural flaw in the two companies’ architecture. Yahoo and Google maintain high speeds through decentralized data centers spanning multiple continents and connected by thousands of miles of fiber optic cable. In order to maximize performance, these data centers continuously sync information between repositories, including whole user accounts and email indexes.

In order to obtain the desired information, the NSA needed to circumvent exemplary data security protocols. These protocols include 24-hour guards, biometric identity verification, and heat-sensitive cameras at data centers. According to the Washington Post’s article, company sources had reason to believe that their internal networks were safe.

Despite these measures, a weakness was uncovered. An internal NSA slide show leaked by Snowden contained a hand-drawn diagram outlining the transition point between Google internal networks and user computers. The drawing highlighted Google front-end servers as the weak point, noting that these servers actively decrypted information and could be exploited for data acquisition purposes.

Neither company was aware of the backdoor intrusion. Both companies acknowledged and acquiesced to front-end requests for data but maintained that their internal networks were secure. Google’s vice president for security engineering, Eric Grosse, had even announced plans to encrypt the links between data centers, despite the presumption that they were secure.

Since the leak, both companies have reacted in outrage. Google’s chief legal officer, David Drummond, remarked on the subject: “We have long been concerned about the possibility of this kind of snooping, which is why we have continued to extend encryption across more and more Google services and links, especially the links in the slide.” Yahoo commented: “We have strict controls in place to protect the security of our data centers, and we have not given access to our data centers to the NSA or to any other government agency.”

Legally speaking, the NSA is exploiting a loophole related to international espionage practices. While Congressional oversight has limited domestic spying, international monitoring remains less inhibited. Because the data centers of the two Internet giants span multiple continents, interception of these data flows is technically permitted under Section 702 of the FISA Amendments Act of 2008.

This international monitoring occurs with the cooperation of the British GCHQ. The UK agency maintains a data cache that can hold three to five days of traffic data before recycling storage. During this time, NSA software uses search terms to sift desirable data from the dregs. This data, once identified, is shipped via fiber-optic cables to the data warehouses in Fort Meade. This information, the agency claims, has produced intelligence leads against “hostile foreign governments.” At this point, this assertion of intelligence value remains largely unsubstantiated, likely due to the classified nature of such leads.

The scope of the MUSCULAR program is evident in the volume of search terms used while sifting through acquired data. According to records, these inquiries involve 100,000 terms, more than twice the number used in the PRISM program. The volume indicated in the Washington Post’s documents topped 181 million records over a 30-day period. The data acquired includes who sent or received emails, the subject of these emails, when and where they were sent, and the text, audio, and video content of these messages.

The program strikes a nerve with both companies due to its unique nature. Both organizations were willing participants in the collection of data through front-end means, but the back-end intrusion remains uncharacteristically aggressive. Google, as mentioned, will move to encrypt its internal networks; however, Yahoo has not indicated whether it will do the same.

The ramifications of these revelations are yet to be seen. However, it is likely that, in the wake of negative public reaction to the PRISM documents, the sentiment will be similar. Ultimately, the continued exposure of agency programs continues to demonstrate the interconnected and heavily monitored nature of our digital communications, a fact that can no longer go unacknowledged.
