Google Tops the List in “Nielsen’s Tops of 2011: Digital”

According to a Nielsen report released December 28, 2011, Google was the most-visited U.S. online brand, with over 153 million unique visitors per month; Facebook came in second and Yahoo! third. YouTube led the way as the top U.S. online destination for video.

Check out the rest of the results: http://blog.nielsen.com/nielsenwire/online_mobile/nielsens-tops-of-2011-digital/

Top 10 U.S. Web Brands in 2011
Rank  Web Brand              Avg. Monthly Unique Visitors (000)
1     Google                 153,441
2     Facebook               137,644
3     Yahoo!                 130,121
4     MSN/WindowsLive/Bing   115,890
5     YouTube                106,692
6     Microsoft               83,691
7     AOL Media Network       74,633
8     Wikipedia               62,097
9     Apple                   61,608
10    Ask Search Network      60,552

Read more at blog.nielsen.com
Source: Nielsen
Data from January – October 2011, Home and Work Computers. Ranked on average monthly unique audience.
Read as: During 2011, an average of 153.4 million people in the U.S. visited Google sites each month from home and work computers.


7 Social Media Trends for 2011

Great article by Heidi Cohen!


7 Social Media Trends for 2011

2010 was a tipping point for social media: it changed how we market, as shown by Pepsi’s Refresh Campaign and Old Spice’s viral videos; connected us during crises, notably the Haitian earthquake and the BP oil spill; and left every firm feeling vulnerable to a PR flare-up, regardless of how broad its engagement. In terms of sheer size, 2010 was notable in that Facebook overtook Google in the number of site visitors.

What does this mean for marketers as we enter 2011?

Marketers and, more importantly, senior management need to take social media seriously and to integrate it across their enterprises. It’s critical to understand that social media networks are where consumers and the public spend their time and engage. This is real life where your audience decides if they like your product and how you’re behaving as a company. Be warned that if they don’t like what your firm is doing, they have the megaphones and connections to get the message out quickly to like-minded individuals.

7 social media trends for 2011

Here are seven predictions for social media’s evolution in 2011.

  1. Think social media boy scouts: Be prepared to respond to your customers and the public. Regardless of how active your company is on social media platforms, you must be ready for a social media flare-up. In 2010, a significant crisis for BP turned worse when the CEO talked about getting back to his personal life. To this end, build your social media tribe early and have a crisis management plan in place. Further, update it regularly to ensure that you’re able to contact people when you have to. In today’s world, upset customers express themselves to a broad audience, often when you’re least prepared.

  2. Get senior management on board for social media activities. Many members of senior management haven’t bought into social media. Now’s the time to get your team trained and engaged. Have them participate on social media platforms before you have to overcome an escalating social media issue. Ford avoided a social media crisis by participating as an active member of the community, not just blasting out one-way messages. Management buy-in is critical to drive your social media activities towards your corporate goals. Don’t overlook the need to educate your senior management and others within your organization.

  3. Not for marketing only! Expand social media usage across the enterprise. Social media can be leveraged to cost-efficiently extend the effectiveness of your organization. For example, social media can extend your customer service, human resources and investor relations by allowing a broader group of people to participate.

  4. Protect your firm, your employees and your customers with corporate guidelines. Social media guidelines can be short and to the point. While this may sound like a nuisance, it supports your employees by telling them what’s acceptable and what’s not. It takes away the guesswork. By doing this, you can enable a broader base of employees to participate in social media on your firm’s behalf and enrich your content offering.

  5. Integrate social media marketing into your overall marketing plan. To enhance the effectiveness of your social media marketing, it’s critical to integrate it into your overall marketing strategies. Well-executed social media marketing requires more than a few tweets a day and a Facebook page. Remember, social media is a multi-directional communications tool. You need to leverage other forms of marketing to let your prospects, customers and the public know about your social media efforts. Often this requires marketing and PR support. Further, bear in mind that advertising on social media platforms, particularly Facebook and Twitter, will gain traction and leverage internal information to generate revenue.

  6. Acknowledge that social media isn’t free. While many social media platforms allow users to interact without fees, from a corporate perspective, social media marketing isn’t free. It requires budget (read: money) and resources (read: employees or consultants). While companies are starting to hire social media experts, the real change occurs when they add internal headcount to manage the process and begin messaging from the inside.

Read more at heidicohen.com

 

Find out all you need to know about Google’s new and improved algorithm!

This week Google announced some adjustments to its famous search algorithm. The new parameters are designed to reward “high quality” content and weed out the low-value content being distributed throughout the web.

So what does Google consider to be “High Quality”?
Here’s a list of tactics you can use to optimize your content and to make sure you are compliant with the new algorithm:

Straight from Google Webmaster Central’s best practices for content optimization: http://www.google.com/support/webmasters/bin/answer.py?answer=35769

Design and content guidelines
    • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
    • Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.
    • Keep the links on a given page to a reasonable number.
    • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
    • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
    • Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images. If you must use images for textual content, consider using the “ALT” attribute to include a few words of descriptive text.
    • Make sure that your <title> elements and ALT attributes are descriptive and accurate.
    • Check for broken links and correct HTML.
    • If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
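Several of the guidelines above (descriptive `<title>` elements, ALT text on images, plain text links, broken-link checks) can be audited mechanically. Here is a minimal sketch of such a check using only Python’s standard library — this is not Google’s tooling, and the class name and sample page are illustrative:

```python
# Minimal crawlability audit sketch: scan an HTML page for a <title>,
# images missing ALT text, and plain text links. Illustrative only.
from html.parser import HTMLParser

class CrawlabilityChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = None               # contents of <title>, if any
        self._in_title = False
        self.images_missing_alt = []    # src of <img> tags with no alt text
        self.text_links = []            # hrefs a text-based crawler can follow

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt.append(attrs.get("src", "?"))
        elif tag == "a" and "href" in attrs:
            self.text_links.append(attrs["href"])

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data.strip()

checker = CrawlabilityChecker()
checker.feed("""<html><head><title>Garden Tools</title></head>
<body><a href="/pruners.html">Pruners</a>
<img src="logo.png"><img src="shears.jpg" alt="Hand shears"></body></html>""")
print(checker.title)                # Garden Tools
print(checker.images_missing_alt)   # ['logo.png']
print(checker.text_links)           # ['/pruners.html']
```

A real audit would also fetch each collected link and flag non-200 responses to catch broken links.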
Technical guidelines
    • Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
    • Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
    • Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
    • Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it’s current for your site so that you don’t accidentally block the Googlebot crawler. Visit http://www.robotstxt.org/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you’re using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
    • Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
    • If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
    • Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.
    • Monitor your site’s performance and optimize load times. Google’s goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve. Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let’s Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.
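You can sanity-check a robots.txt file before deploying it — the guidelines above warn against accidentally blocking Googlebot while recommending you block auto-generated search-results pages. This sketch uses Python’s standard-library parser rather than the Webmaster Tools analyzer the text mentions; the rules and URLs are made-up examples:

```python
# Sanity-check robots.txt rules with the stdlib parser. The rules and
# example.com URLs below are illustrative, not real site policy.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /search
Disallow: /tmp/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Auto-generated search-results pages are blocked; regular content is not.
print(rp.can_fetch("Googlebot", "https://example.com/search?q=widgets"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))   # True
```

Running checks like this against every important URL pattern on your site is a cheap way to catch a rule that would unintentionally block the Googlebot crawler.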
Quality guidelines

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google’s quality guidelines, please report that site at https://www.google.com/webmasters/tools/spamreport. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts.

Quality guidelines – basic principles

    • Make pages primarily for users, not for search engines. Don’t deceive your users or present different content to search engines than you display to users, which is commonly referred to as “cloaking.”
    • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
    • Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.
    • Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.