Saturday, March 31, 2012

Instant America
Created by: Online Graduate Programs

For Impatient Web Users, an Eye Blink Is Just Too Long to Wait



http://www.nytimes.com/2012/03/01/technology/impatient-web-users-flee-slow-loading-sites.html?_r=4&hp&mkt_tok=3RkMMJWWfF9wsRokuajNZKXonjHpfsXx6ustUKag38431UFwdcjKPmjr1YIATMV0dvycMRAVFZl5nQJdDfKQcIlD

The Blink of an Eye? Oh, Please

According to Harry Shum, a Microsoft computer scientist, computer users will visit a Web site less often if its loading time is slower than its competitors’ by 250 milliseconds, or one-quarter of a second. That is less time than a single eye blink.


Photo: Arvind Jain, a Google engineer, pointed out the loading speed of individual elements of a website on a test application used to check efficiency, at Google offices in Mountain View, Calif. (Peter DaSilva for The New York Times)
Wait a second.
No, that’s too long.
Remember when you were willing to wait a few seconds for a computer to respond to a click on a Web site or a tap on a keyboard? These days, even 400 milliseconds — literally the blink of an eye — is too long, as Google engineers have discovered. That barely perceptible delay causes people to search less.
“Subconsciously, you don’t like to wait,” said Arvind Jain, a Google engineer who is the company’s resident speed maestro. “Every millisecond matters.”
Google and other tech companies are on a new quest for speed, challenging the likes of Mr. Jain to make fast go faster. The reason is that data-hungry smartphones and tablets are creating frustrating digital traffic jams, as people download maps, video clips of sports highlights, news updates or recommendations for nearby restaurants. The competition to be the quickest is fierce.
People will visit a Web site less often if it is slower than a close competitor by more than 250 milliseconds (a millisecond is a thousandth of a second).
“Two hundred fifty milliseconds, either slower or faster, is close to the magic number now for competitive advantage on the Web,” said Harry Shum, a computer scientist and speed specialist at Microsoft.
The performance of Web sites varies, and so do user expectations. A person will be more patient waiting for a video clip to load than for a search result. And Web sites constantly face trade-offs between visual richness and snappy response times. As entertainment and news sites, like The New York Times Web site, offer more video clips and interactive graphics, that can slow things down.
But speed matters in every context, research shows. Four out of five online users will click away if a video stalls while loading.
On a mobile phone, a Web page takes a leisurely nine seconds to load, according to Google, which tracks a huge range of sites from the homes of large companies to the legions of one-person bloggers. Download times on personal computers average about six seconds worldwide, and about 3.5 seconds on average in the United States. The major search engines, Google and Microsoft’s Bing, are the speed demons of the Web, analysts say, typically delivering results in less than a second.
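For a rough sense of how such download figures are gathered, here is a minimal sketch that times a bare HTTP fetch from Python. It measures only the raw transfer of the HTML (not images, scripts, or rendering), so it will understate full page-load times; the URL is just a placeholder.

```python
# Rough page-fetch timer: measures raw HTML transfer time only, not
# rendering or subresource loading. The URL is a placeholder.
import time
import urllib.request

def average_fetch_seconds(url, attempts=3):
    """Return the mean time, in seconds, to download the page body."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()  # pull the whole body so transfer time is counted
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    avg = average_fetch_seconds("http://www.example.com/")
    print(f"average fetch time: {avg * 1000:.0f} ms")
```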
The hunger for speed on smartphones is a new business opportunity for companies like Akamai Technologies, which specializes in helping Web sites deliver services quicker. Later this month, Akamai plans to introduce mobile accelerator software to help speed up the loading of a Web site or app.
The government too recognizes the importance of speed in mobile computing. In February, Congress opened the door to an increase in network capacity for mobile devices, proposing legislation that permits the auction of public airwaves now used for television broadcasts to wireless Internet suppliers.
Overcoming speed bumps is part of the history of the Internet. In the 1990s, as the World Wide Web became popular, and crowded, it was called the World Wide Wait. Invention and investment answered the call.
Laying a lot of fiber optic cable for high-speed transmission was the first solution. But beyond bandwidth, the Web got faster because of innovations in software algorithms for routing traffic, and in distributing computer servers around the world, nearer to users, as a way to increase speed.
Akamai, which grew out of the Massachusetts Institute of Technology’s Laboratory for Computer Science, built its sizable business doing just that. Most major Web sites use Akamai’s technology today.
 The need for speed itself seems to be accelerating. In the early 1960s, the two professors at Dartmouth College who invented the BASIC programming language, John Kemeny and Thomas Kurtz, set up a network in which many students could tap into a single, large computer from keyboard terminals.
“We found,” they observed, “that any response time that averages more than 10 seconds destroys the illusion of having one’s own computer.”
In 2009, a study by Forrester Research found that online shoppers expected pages to load in two seconds or less, and that at three seconds a large share abandoned the site. Only three years earlier, a similar Forrester study had found that the average expectation for page load times was four seconds or less.
The two-second rule is still often cited as a standard for Web commerce sites. Yet experts in human-computer interaction say that rule is outdated. “The old two-second guideline has long been surpassed on the racetrack of Web expectations,” said Eric Horvitz, a scientist at Microsoft’s research labs.
Google, which harvests more Internet ad revenue than any other company, stands to benefit more than most if the Internet speeds up. Mr. Jain, who worked at Microsoft and Akamai before joining Google in 2003, is an evangelist for speed both inside and outside the company. He leads a “Make the Web Faster” program, begun in 2009. He also holds senior positions in industry standards groups.
Speed, Mr. Jain said, is a critical element in all of Google’s products. There is even a companywide speed budget; new offerings and product tweaks must not slow down Google services. But there have been lapses.
In 2007, for example, after the company added popular new offerings like Gmail, things slowed down enough that Google’s leaders issued a “Code Yellow” and handed out plastic stopwatches to its engineers to emphasize that speed matters.
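The article doesn’t describe how that speed budget is enforced, but the idea maps naturally onto an automated latency gate: time a code path and fail loudly if it overruns its allowance. The sketch below is purely hypothetical; the budget value and the guarded block are illustrative, not Google’s actual process.

```python
# Hypothetical "speed budget" guard: raise if a block of code runs longer
# than its agreed budget. Budget values and the guarded work are illustrative.
import time
from contextlib import contextmanager

@contextmanager
def speed_budget(name, budget_ms):
    """Fail loudly if the wrapped block exceeds budget_ms milliseconds."""
    start = time.perf_counter()
    yield
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms > budget_ms:
        raise RuntimeError(f"{name}: {elapsed_ms:.0f} ms exceeds the {budget_ms} ms budget")

# Example: a new feature must stay inside its 250 ms allowance.
with speed_budget("render_results_page", budget_ms=250):
    time.sleep(0.1)  # stand-in for the feature's work
```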
Still, not everyone is in line with today’s race to be faster. Mr. Kurtz, the Dartmouth computer scientist who is the co-inventor of BASIC, is now 84, and marvels at how things have changed.

VirusTotal

https://www.virustotal.com/

VirusTotal is a free service that analyzes suspicious files and URLs and facilitates the quick detection of viruses, worms, trojans, and all kinds of malware.



VirusTotal statistics:

99,679,786 files
8,380,857 URLs
65,621 users
665,035 comments
60,681 votes

The following charts describe file scanning and file submissions to VirusTotal over the last seven days. Where a metric is not tracked on a daily basis, the aggregate volume for the last seven days is shown instead.
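Beyond the web interface, VirusTotal can also be queried programmatically. The sketch below assumes the public v2 API endpoint and a registered API key (placeholder shown); confirm both, and the response fields, against VirusTotal’s current API documentation before relying on it.

```python
# Minimal sketch: look up an existing VirusTotal report for a file hash.
# Assumes the public v2 API endpoint and a valid API key (placeholder below);
# response fields such as "positives"/"total" should be checked against the docs.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder
REPORT_URL = "https://www.virustotal.com/vtapi/v2/file/report"

def file_report(file_hash):
    """Fetch the scan report for an MD5/SHA-1/SHA-256 hash, if one exists."""
    params = urllib.parse.urlencode({"apikey": API_KEY, "resource": file_hash})
    with urllib.request.urlopen(f"{REPORT_URL}?{params}", timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))

report = file_report("44d88612fea8a8f36de82e1278abb02f")  # widely published EICAR test-file MD5
print(report.get("positives"), "of", report.get("total"), "engines flagged it")
```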

Top IT skills wanted for 2012

http://www.techrepublic.com/blog/career/top-it-skills-wanted-for-2012/3503


Top IT skills wanted for 2012



Takeaway: A new Computerworld survey indicates the nine IT skills that will be in demand in 2012.
Nearly 29 percent of the 353 IT executives who were polled in Computerworld’s annual Forecast survey said they plan to increase IT staffing through next summer. (That’s up from 23% in the 2010 survey and 20% in the 2009 survey.)
Here are the skills that the IT executives say they will be hiring for:
  1. Programming and Application Development: 61% plan to hire for this skill in the next 12 months, up from 44% in the 2010 survey. This covers the gamut from website development to upgrading internal systems and meeting the needs of mobile users.
  2. Project Management (but with a twist): The twist is that employers aren’t just looking for people who can oversee and monitor projects. They also want people who can identify users’ needs and translate them for the IT staff: the increasingly popular business analysts.
  3. Help Desk/Technical Support: Mobile operating systems have added a new dimension to help desk and tech support.
  4. Networking: This demand is being fueled partly by virtualization and cloud computing projects. The survey also revealed that execs will be looking for people with VMware and Citrix experience.
  5. Business Intelligence: Computerworld attributes this uptick to a shift in focus at many companies from cost savings to investing in technology. That will be nice if it pans out that way.
  6. Data Center: Virtualization and the cloud could also be behind the increased need for IT professionals with backgrounds in data center operations and systems integration.
  7. Web 2.0: Tech skills centered on social media will be in demand, with .NET, AJAX and PHP as key back-end skills, and HTML, XML, CSS, Flash and JavaScript, among others, on the front end.
  8. Security: Although down from 32 percent in the 2010 survey, security remains a top concern of IT executives.
  9. Telecommunications: The survey indicates demand for people with IP telephony skills, and for those familiar with Cisco IPCC call center systems.

How to augment our intelligence as algorithms take over the world

http://www.smartplanet.com/blog/thinking-tech/how-to-augment-our-intelligence-as-algorithms-take-over-the-world/10588?tag=mantle_skin;content


How to augment our intelligence as algorithms take over the world

March 9, 2012, 12:19 PM PST
Physicist, mathematician and Chief Technology Officer of Quid Sean Gourley will paint a picture of the not-too-distant future in his talk this weekend at a TEDx event in Mountain View, California. That picture includes a whole new ecology of beings popping into existence, fighting for survival and supremacy, and finally exhausting themselves under the inevitable force and weight of evolution. The beings that Gourley will talk about, however, are not living, breathing entities. They’re algorithms.
I caught up with Sean Gourley this week to hear his thoughts on the ecology of algorithms, predator/prey relationships, and what we as mere mortals can do in the face of computers that move at speeds far faster than our brains can perceive. The conclusions he presents are both terrifying and endlessly fascinating.
First, despite all the talk of Big Data, Gourley sees a more important trend in the rise of small, fairly dumb algorithms that happen to be very powerful. These are programs that follow information, determine patterns, and use trigger points to set new sequences of activity in motion. Right now, algorithms already control the majority of equity trades in the US financial market. Humans loose these algorithms into the wild, but they move at such a rate that we can’t follow them or even regulate these programs before they’ve moved on to the next level of market analysis, and the next batch of trading decisions.
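Gourley doesn’t spell out what these small algorithms look like, but their general shape is easy to sketch: watch a stream of values, maintain a simple statistic, and fire an action when a trigger condition is met. The following is a generic, hypothetical illustration of that pattern, not an actual trading system.

```python
# Generic trigger-driven algorithm: track a short moving average of a stream
# and fire an action when the latest value strays too far from it.
from collections import deque

def watch(stream, window=20, threshold=0.02, action=print):
    """Call `action` when a value deviates from the recent average by > threshold."""
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) == recent.maxlen:
            avg = sum(recent) / len(recent)
            if abs(value - avg) / avg > threshold:
                action(f"trigger: {value:.2f} deviates from recent average {avg:.2f}")
        recent.append(value)

# Example with a synthetic, slowly drifting series that ends in a sudden jump.
prices = [100 + 0.01 * i for i in range(50)] + [105.0]
watch(prices)
```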
Ecologists are now tracking financial algorithms and finding that their behavior mimics that of predator and prey animal species. One program may try to hide a large transaction, for example, by creating swarms of smaller transactions meant to fool a competing program. Sometimes the trick works, and sometimes the competing program detects the subterfuge and counterstrikes.
Beyond financial markets, the same trend is starting to take hold in the advertising world, with algorithms determining when and where advertisers should bid on available inventory. There are also retailers who use computer programs to set prices on Amazon, and a wide variety of recommendation engines relying on algorithms to decide what information should be presented to us for review.
So, should we be worried? Are the machines taking over? Is a Skynet-like authority inevitable? Gourley says there will likely be a backlash as more people realize how much control we’re rapidly relinquishing to computer programs. But he also points out that a lot of the things these algorithms do aren’t things we want, or have the resources, to do ourselves. Take GPS software. Increasingly, GPS software analyzes not only maps but also construction work, traffic congestion and more to determine the fastest route to any destination. Perhaps we could scan the same data sources to map out our own path, but it would take us far longer and probably wouldn’t be any more effective.
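In algorithmic terms, the GPS example is a weighted shortest-path search: each road segment’s cost is its base travel time inflated by a congestion factor, and the router picks the cheapest total. Here is a toy sketch with made-up roads and congestion multipliers.

```python
# Toy congestion-aware router: Dijkstra over a small road graph whose edge
# costs are base travel times multiplied by a congestion factor. Data are made up.
import heapq

# graph[node] = list of (neighbor, base_minutes, congestion_multiplier)
graph = {
    "home": [("highway", 5, 2.5), ("back_road", 9, 1.0)],
    "highway": [("office", 6, 2.0)],
    "back_road": [("office", 8, 1.1)],
    "office": [],
}

def fastest_route(start, goal):
    """Return (total_minutes, path) using congestion-adjusted edge costs."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, minutes, congestion in graph[node]:
            heapq.heappush(queue, (cost + minutes * congestion, neighbor, path + [neighbor]))
    return float("inf"), []

print(fastest_route("home", "office"))  # picks the slower road because it is uncongested
```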
Gourley suggests that the best answer to understanding and regulating complex data systems is to start augmenting our own intelligence with computing tools that make these systems human-readable. In other words, we need to create layers of abstraction so that we can fathom the activity taking place, even if it’s happening too fast for our human brains. Gourley’s company Quid designs some of these tools, creating multi-dimensional models to help people understand vast quantities of ever-changing data. Humans, he says, can manage three-dimensional models, handle the addition of a fourth dimension – time – and even comprehend a pseudo fifth dimension through size depictions of changing data conditions. (There may be a viable fifth-and-a-half dimension too in the use of granular color changes to represent different types of activity.) Once we hit six-dimensional models, however, Gourley says humans will struggle. Our sweet spot appears to be right below that level.
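In visualization terms, those dimensions map onto familiar encodings: position supplies three, time a fourth (by animating successive frames), marker size a fifth, and color the extra “half.” The sketch below renders one static frame of such an encoding with random data; it only illustrates the idea and has nothing to do with Quid’s actual tools.

```python
# One frame of a "five-and-a-half-dimension" view: x, y, z from position,
# marker size as a fifth dimension, color as the extra "half"; time would come
# from animating successive frames. Random data, purely illustrative.
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3D projection on older matplotlib)

rng = np.random.default_rng(0)
x, y, z = rng.random((3, 200))      # three spatial dimensions
magnitude = rng.random(200) * 300   # fifth dimension -> marker size
category = rng.random(200)          # "half" dimension -> color shade

fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")
ax.scatter(x, y, z, s=magnitude, c=category, cmap="viridis", alpha=0.7)
ax.set_xlabel("x"); ax.set_ylabel("y"); ax.set_zlabel("z")
plt.show()
```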
With sophisticated modeling tools – even up to only five and a half dimensions – Gourley believes humans can stay in control of machines. Or, at least we can point our algorithms in the right direction. Just because the world is growing more complex, he says, doesn’t mean we can’t still make decisions. We just might need a little extra help.

Google Tablet: A very good thing done the right way

http://www.zdnet.com/blog/mobile-news/google-tablet-a-very-good-thing-done-the-right-way/7310?tag=mantle_skin;content

Google Tablet: A very good thing done the right way

March 30, 2012, 5:10 AM PDT
Summary: Google may be producing and selling its own tablet if rumors are accurate. Let’s hope it does it the right way to give it a chance to succeed.
Android tablets have failed to set the world on fire by anybody’s reckoning, and the word that Google is looking to step directly into the fray is good. According to those “close to the arrangement” Google intends to sell Google-branded tablets directly to consumers via an online store.
Critics are reminded of Google’s failed attempt to sell Nexus One smartphones the same way, and expect the tablet venture to have the same result. That is a real possibility, but if Google does this the right way it could breathe life into the flailing Android tablet space.
The rumors circulating indicate Google will partner with ASUS and Samsung to produce tablets carrying the Google brand, with heavy price subsidies floated as a way to move tablets in an iPad-dominated market. That sounds like the avenue Google might take for this effort, but if so, it likely won’t work.
What Google needs to do is step up to the plate and fully leverage its buyout of Motorola Mobility. Forget simply rebranding tablets made by other companies and develop a genuine Google Tablet in house after the merger is complete.
Google should sit down in internal meetings with the proper resources at Motorola and jointly design a real Google Tablet that can compete in the market. Google services, including the Google Play market, should be tightly integrated into the tablet at every level. Don’t worry about competing with partners’ tablet offerings; by the partners’ own admission, they aren’t selling in numbers anyway.
Motorola and Google could build a Google Tablet that is a direct competitor to the Amazon Kindle Fire, and no matter how you look at it, that is the real competition for Google. Build a tablet that looks and operates in a way that facilitates using (and buying) Google services and products. Mainstream consumers are not buying tablets based on whiz-bang features; that has already been proven. They are buying Kindle Fires by the millions to get a simple user experience for buying and consuming content from Amazon.
The Google Tablet doesn’t need to compete with the high-end Android tablets currently on the market. It needs a rock-solid interface with trouble-free integration with everything Google. It should be offered at a competitive price, and that means competitive with the Kindle Fire. While that likely means selling it at a loss, as Amazon is believed to be doing, the Google/Motorola effort should be able to scale to keep that loss to a minimum.
It comes down to what Google is really trying to do here. Simply selling rebranded tablets at a big loss is not going to prove anything, and probably won’t advance the Android OS in the tablet space. However, if Google really wants a viable Google Tablet to have a legitimate chance in the market, it had better produce one itself. Once the Motorola merger is completed later this year, Google will have the advantage of controlling the entire tablet process, something that so far only Apple enjoys. That’s the key to the Google Tablet.

Friday, March 30, 2012

Five free network monitoring tools

http://www.techrepublic.com/blog/five-apps/five-free-network-monitoring-tools/1342?tag=content;roto-fd-feature


Five free network monitoring tools

Takeaway: Among the wide array of network and system monitors, you’ll find several that do what the pricier tools do — for free.
If you’re a system or network administrator, you need monitoring tools. You have to know, at all times, the status of your systems so you can optimize performance and head off potential problems. Thankfully, plenty of tools are available to help you stay in the know about your systems. Some of these products are costly and do quite a lot. But others are free and do just as much — and in some cases, more. That’s right. More.
I want to introduce you to five system and/or network monitors that do more than you’d think they could do. From this list of products you will certainly find one or more tools that will serve your needs.

1: Observium

Observium (Figure A) is “an autodiscovering PHP/MySQL/SNMP-based network monitoring [tool].” It focuses on Linux, UNIX, Cisco, Juniper, Brocade, Foundry, HP, and more. With Observium, you’ll find detailed graphs and an incredibly easy-to-use interface. It can monitor a huge number of processes and systems. The only downside is a lack of auto alerts. But to make up for that, you can set Observium up alongside a tool like Nagios for up/down alerts.

Figure A: Observium

2: Ganglia

Ganglia (Figure B) is a “scalable distributed monitoring system” focused on clusters and grids. It gives you a quick and easy-to-read overview of your entire clustered system. This monitor has been ported to many platforms and is used on thousands of clusters around the world. Anyone who employs server clusters should have Ganglia monitoring that system. Ganglia can scale to handle clusters with up to 2,000 nodes.

Figure B: Ganglia
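
If you want to confirm that Ganglia is actually collecting data, its gmond daemon publishes cluster state as XML over a plain TCP socket (port 8649 by default, though your configuration may differ). The sketch below simply connects and lists the metric names it finds.

```python
# Quick gmond check: read the XML dump Ganglia's gmond serves on its TCP port
# (8649 by default; adjust host/port for your setup) and list the metric names.
import socket
import xml.etree.ElementTree as ET

def list_metrics(host="localhost", port=8649):
    """Return the sorted metric names reported by a gmond instance."""
    chunks = []
    with socket.create_connection((host, port), timeout=10) as sock:
        while True:
            data = sock.recv(4096)
            if not data:  # gmond closes the connection after sending the dump
                break
            chunks.append(data)
    root = ET.fromstring(b"".join(chunks))
    return sorted({metric.get("NAME") for metric in root.iter("METRIC")})

print(list_metrics())
```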

3: Spiceworks

Spiceworks (Figure C) is becoming one of the industry standard free network/system monitoring tools. Although you have to put up with some ads, the features and Web-based interface can’t be beat. Spiceworks monitors (and autodiscovers) your systems, alerts you if something is down, and offers outstanding topographical tools. It also allows you to get social with fellow IT pros via the Spiceworks community, which is built right in.

Figure C: Spiceworks

4: Nagios

Nagios (Figure D) is considered by many to be the king of open source network monitoring systems. Although not the easiest tool to set up and configure (you have to manually edit configuration files), Nagios is incredibly powerful. And even though the idea of manual configuration might turn some off, that setup actually makes Nagios one of the most flexible network monitors around. In the end, the vast number of features Nagios offers is simply unmatched. You can even set up email, SMS, and printed paper alerts!

Figure D: Nagios
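
Much of that flexibility comes from Nagios’s simple plugin contract: a check is any executable that prints one status line and exits 0, 1, 2, or 3 for OK, WARNING, CRITICAL, or UNKNOWN. A minimal custom check written to that convention might look like the sketch below; the host, port, and thresholds are placeholders.

```python
#!/usr/bin/env python3
# Minimal Nagios-style check: verify a TCP port answers within a time threshold.
# Prints one status line and exits 0/1/2 (OK/WARNING/CRITICAL) per the standard
# plugin convention. Host, port, and thresholds are placeholders.
import socket
import sys
import time

HOST, PORT = "example.com", 80
WARN_SECONDS, CRIT_SECONDS = 0.5, 2.0

try:
    start = time.perf_counter()
    socket.create_connection((HOST, PORT), timeout=CRIT_SECONDS).close()
    elapsed = time.perf_counter() - start
except OSError as exc:
    print(f"CRITICAL - cannot connect to {HOST}:{PORT} ({exc})")
    sys.exit(2)

if elapsed >= CRIT_SECONDS:
    print(f"CRITICAL - {HOST}:{PORT} answered in {elapsed:.2f}s")
    sys.exit(2)
if elapsed >= WARN_SECONDS:
    print(f"WARNING - {HOST}:{PORT} answered in {elapsed:.2f}s")
    sys.exit(1)
print(f"OK - {HOST}:{PORT} answered in {elapsed:.2f}s")
sys.exit(0)
```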

5: Zabbix

Zabbix (Figure E) is as powerful as any other network monitoring tool, and it also offers user-defined views, zooming, and mapping on its Web-based console. Zabbix offers agent-less monitoring, collects nearly ANY kind of data you want to monitor, does availability and SLA reporting, and can monitor up to 10,000 devices. You can even get commercial support for this outstanding open source product. One unique Zabbix feature is the option to set audible alerts. Should something go down, have Zabbix play a sound file (say, a Star Trek red alert klaxon?).

Figure E: Zabbix

Your choice

There are many tools available for the monitoring of systems and networks. The tool you choose could determine your ability to handle your job efficiently. Make sure you take a look at one or more of the applications above. With some unique features on offer, these tools stand out above the rest.
Do you use any of these monitoring tools? What other top contenders would you add to the list?
