Tuesday, September 09, 2003

A Short History of Technology and the Web

From Jeremy Allaire's article “A Short History of Technology and the Web”

Trends fuelling the growth of the Internet

Affordable personal computers.
Around 1994-95, the combination of price and functionality made PCs viable household goods.

Low-cost connectivity.
Until 1994, dial-up connectivity belonged to geeks. After 1994, large-scale connectivity infrastructure investments by AOL, Microsoft, and thousands of smaller Internet service providers (ISPs) provided an affordable means to get online.

Ubiquity of LANs.
In the mid-1990s, the local area network (LAN) caught on in companies. LANs became the building blocks of the Internet.

Mass-market server software.
Around 1994, mass-market server platforms started emerging. Until then, deploying a web server had been a pricey, complex proposition.

Digital media creation tools.
Around the mid-1990s, many PC software applications emerged for tweaking graphics, text, and audio.

Individually, each of these trends is significant; each is a force driving the growth and development of the PC and computer industry. But when combined, these trends create a growth dynamic with incredible power and thrust.


Trends for the next version of the Internet

10 Trends for Internet 2.0
There appear to be at least 10 significant trends that constitute this next-generation Internet opportunity.

Broadband Rules

Broadband growth is around 80% year over year. This shows that the Internet has become central to our daily lives and that the value of an always-on, reliable connection is compelling.

Broadband creates a massive new opportunity for media, software, and services delivered over the Internet.

Wireless - Alive and well

Mobile handset technology is now rich enough to deliver real consumer value. Unlike low-quality, low-speed WAP, the new world of smart phones includes multimedia messaging, integrated cameras and other digital media support, and software runtimes like BREW, J2ME and Macromedia Flash. It’s creating a wave of innovation and growth. The mobile landscape is now attractive and real.

Wi-Fi is growing by leaps and bounds. The level of investment, innovation, and growth in broadband wireless delivered using Wi-Fi is nearly identical to the growth in ISPs, TCP/IP, and Internet access in 1994 and 1995. New applications and ways of doing business built around wireless will emerge.

Digital lifestyle devices

Digital cameras, camcorders, and digital music players have become part of daily life, offering new sources of content and new distribution channels for media.

As broadband and Wi-Fi penetrate the home market, they are instigating a new wave of innovation around Wi-Fi devices such as wireless security cameras, Internet stereos, video phones, and even Wi-Fi enabled music devices for automobiles.

Rich Clients
To many people, the level of innovation in client technology on the Internet has appeared to stall; HTML 4.0 and Internet Explorer seem to provide the platform for web experiences. In reality, innovation has moved steadily along, primarily led by the now ubiquitous adoption of rich client technology such as Macromedia Flash Player.

Rich client technology can transform the quality and boost the usefulness of Internet applications, media, and communications because it combines desktop-like experiences with the deployment and content-rich aspects of the web. And, in the coming year, Macromedia Central will extend this model further by providing a new client platform for the distribution and use of Internet software and media. Also this year, Microsoft will describe and promote its .NET client technology as a post-browser approach to Internet applications and content.

Web Services
Web services technology promises to radically change the usefulness of software in the world.

Within the next year, nearly 100% of new runtimes (client and server) will be SOAP-capable deployment platforms. This means that nearly any piece of code running anywhere in the world can invoke any other code on the network. This new model of application interoperability is affecting dozens of software markets. It provides the potential for new levels of productivity, integration inside of enterprises, and most importantly, it lays the foundation for interenterprise applications at a level we’ve never seen before.
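To make the remote-invocation idea concrete, here is a minimal, self-contained sketch in Python. It is my own illustration, not anything from Allaire's article: it uses XML-RPC, a simpler XML-over-HTTP cousin of SOAP that ships with Python's standard library, to show one program exposing a function and another invoking it over the network as if it were local.

import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

def add_totals(amounts):
    # A trivial "business" function exposed to remote callers.
    return sum(amounts)

# Expose the function over HTTP on localhost.
server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(add_totals, "add_totals")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any other program on the network can now invoke it as if it were a
# local call; arguments and results travel as XML over HTTP.
proxy = xmlrpc.client.ServerProxy("http://localhost:8000")
print(proxy.add_totals([12, 4, 7]))   # -> 23
server.shutdown()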

Progressing side by side with the web services trend is the rapid adoption and popularity of microcontent formats such as RSS. Primarily used in the context of weblog or blog software, RSS and sister standards like RDF are driving the Internet towards well-structured, easily searchable and sharable data.
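As a small illustration of what that structure buys you (my own example, not from the article), the sketch below parses a made-up RSS 2.0 feed with nothing but Python's standard library and pulls out each item's title and link; the same few lines work against any site that publishes a feed.

import xml.etree.ElementTree as ET

rss = """<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item><title>Broadband rules</title><link>http://example.com/1</link></item>
    <item><title>Wi-Fi everywhere</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(rss)
for item in root.findall("./channel/item"):
    # Every item exposes the same structured fields, no matter which
    # site produced the feed.
    print(item.findtext("title"), "->", item.findtext("link"))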

Unlike the 1.0 Internet, hacked together with logic and data isolated in stovepipes, web services and microcontent unlock the value of software and data and foster new economic models of cross-company interchange.

Real-Time Communications

The Internet is rapidly evolving from a one-way and text-based medium to a rich, multi-directional and real-time communications environment. Over the last several years, there has been mainstream adoption of real-time communications technology such as instant messaging in consumer and corporate settings. And while instant messaging may be a major driver for change, there’s a lot of focus on new platforms that enable real-time communication and collaboration within custom applications.

At the forefront of this innovation are Macromedia Flash Player and Macromedia Flash Communication Server. Flash Communication Server provides the first broadly available platform for building real-time collaborative user interfaces that incorporate multiway text, audio, and video as communications forms. Additionally, other software and online service companies such as Microsoft, Yahoo, IBM and AOL are providing their users with real-time communications applications.

Hosted Applications (Apps on Tap)
The model of delivering software as a hosted service continues to gain traction. Although many dismissed this approach as a failure only a couple of years ago, rich clients’ popularity and web services’ ability to integrate hosted applications into an enterprise has promoted the adoption of hosted applications. This growth promises to transform the use of software in corporations around the world.

Big Data
Over the past several years, the price-to-value ratio of storage and bandwidth has improved dramatically, expanding what can be delivered to consumer PCs over broadband. PCs now have plentiful storage, and, on the delivery side of the equation, it is also economical to deliver and manage large quantities of rich media, including high-quality video. The trends on this front are only accelerating.

Paid Content
This new Internet environment is seeing the adoption of paid content as a business model for the Internet. Rich clients and broadband now make it possible to build high quality digital assets, and consumers seem willing to pay.

Unlike Internet 1.0’s 'information wants to be free' mantra, people now willingly pay to download music and to subscribe for access to quality content or games and entertainment. This shift from free to paid content is dramatic and symbolizes the maturity of the Internet as a media and commerce platform.

For example, RealOne has over 1 million paid subscribers for their video on demand Internet service. And dozens of other media brands are experimenting and seeing success. AOL has made it clear that their future is in broadband-enabled, high-quality paid services. Yahoo is experiencing robust growth in premium services, and the broadband ISPs are betting that paid content will form the next wave of profit growth, expanding the current access fee-only model. To make it easy to get into the game, Macromedia Central includes a model for creating paid content services and applications.

The Software Manufacturing Economy
Nearly every new Internet opportunity is based on shifts in how software is manufactured and sold. The people, places, frameworks, and materials used to create and distribute software are changing dramatically:

Component-based software.
The rapid adoption of Java and .NET runtime and development platforms makes it possible to easily design, compose, and integrate software assets.

Open source.
Open source continues its forward march, giving developers access to low-cost software manufacturing and greater control over the code that applications are built upon. Because of this, application ISVs are opting for open source materials such as Apache Axis, Tomcat, Linux, and MySQL.

Global outsourcing.
Economic pressures are driving software companies to rely heavily on global outsourcing for software manufacturing. When combined with open source as a material and component architectures as a design model, it is becoming easier than ever to construct complex software projects overseas.

Web services and hosted applications.
Web services and hosted applications now deliver software products online, eliminating the need for packaged products. This shift redefines product sales and distribution channels for software and provides radical new economies of scale.


Friday, September 05, 2003

From the Economist

The Economist speaks....
DVD piracy may be coming soon, with broadband on the rise and file storage becoming smarter.
How do you fight it?

American media moguls are adopting the following strategies:

1. Delete content after the user has “consumed” it.
2. Offer movies cheaply online.
3. Seek new laws.
4. Educate the young with a new curriculum which teaches that swapping content is wrong.
5. Go after file swappers in court.

Movielink is an online site charging $3-5 to download a movie, but the service is still “clunky”.

According to the Pew Internet & American Life Project, 65% of people who share music and video files online say they do not care if the material is copyrighted.

In the 1980s, software companies used to fight pirates with copy-protection technology. But they found that copy protection annoyed users, and got rid of it. The makers of Lotus 1-2-3 abandoned it after finding that they had merely created a new market for software that could defeat copy protection. Now the music industry is realising that some of the downloaders it labels as thieves are actually trying out music before they buy it, and that controlled, legal file-sharing could be a marketing tool. Viral marketing of that kind could be powerful.


Infoviz
The term “killer application” was coined 25 years ago for the spreadsheet. It was the reason many people bought their first PC: it allowed them to build models and play with their data. With spreadsheets, “what if” scenarios could be calculated and recalculated easily; if the value in one cell was changed, the data in related cells were automatically adjusted. Users could “converse with the data”.
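A toy Python sketch of that recalculation idea (my own illustration, nothing from the article): cells hold either plain values or formulas over other cells, and changing one input automatically changes every dependent result the next time the sheet is read.

cells = {
    "A1": 1000,                               # units sold
    "A2": 2.5,                                # price per unit
    "A3": lambda get: get("A1") * get("A2"),  # revenue = units x price
    "A4": lambda get: get("A3") * 0.3,        # profit at an assumed 30% margin
}

def evaluate(name):
    # Return a cell's value, recursively resolving formula cells.
    cell = cells[name]
    return cell(evaluate) if callable(cell) else cell

print(evaluate("A4"))   # 750.0
cells["A1"] = 2000      # "what if" we sold twice as many units?
print(evaluate("A4"))   # 1500.0 - dependent cells adjust automatically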

Information visualisation is all about making data visible, or, more precisely, making visible the patterns hidden in the data. Graphic aids such as charts have done this for ages. What is new, he and his colleagues explain in the book, “is that the evolution of computers is making possible a medium for graphics with dramatically improved rendering, real-time interactivity and cost.”


Nanotech for solving the energy crisis
The challenge is working out how to provide the world with enough energy when the population reaches 10 billion and the global energy requirement has soared from today's 14 terawatts (ie, 14m megawatts) to anything from 30 to 60 terawatts of capacity. The lack of energy, he contends, is the single biggest issue facing mankind today. Certainly, some of the world's more intractable problems (war and poverty, water and food shortages, disease and pollution) are connected, in some way or other, with energy deficiency.
Short of building nuclear power plants outside every major city, he believes that the only way out of the energy impasse is through the use of nanotechnology.

SCO - All bark, no bite?
Just as apes and humans allegedly have common ancestors, several operating systems, including Linux, can trace their lineage to UNIX. The SCO chairman claims he soon found “massive and widespread violations” of Caldera's intellectual property in the Linux code.

What most bothers the open-sourcers is SCO's refusal to reveal which lines of code it considers problematic. “Here are these people who claim we are pirates but refuse to say where and how,” says Bruce Perens, an open-source evangelist. After all, he says, remedying the situation would be “trivially easy”. The Linux “community”—numberless hobby hackers—would simply converge on the code and rewrite it within hours or days.

SCO has caused enough uncertainty that technology consultancies, such as Gartner and Yankee Group, are advising clients to wait and see before adopting Linux. It has not gone unnoticed that Microsoft is one of the few companies that has actually paid SCO for a Linux licence, even though Microsoft has no use for one. Microsoft and SCO vehemently deny that they are in league, but most open-sourcers assume that the evil Redmond giant is bankrolling a mercenary.


Backing up everything ever published

How do you ensure that readers will still be able to access electronic academic journals even centuries after they have been published?

The project, called LOCKSS (short for “lots of copies keep stuff safe”), addresses a vexing problem that librarians face everywhere. Increasingly, academic journals are published online; many are not even available in print. As a result, libraries are losing the option of maintaining local collections—but are leery of discontinuing paper subscriptions.

The project's designers looked long and hard at what the great libraries of the world have done over the millennia. First, they acquire copies and make them available to their local readers, while seeking to preserve them to the best of their ability. And if copies get lost or destroyed, they lend them to each other. It is these circulating collections, which in effect form a peer-to-peer network with no central authority, that LOCKSS seeks to mimic.
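A toy sketch of the circulating-copies idea in Python (my own simplification, not the actual LOCKSS protocol): several libraries hold copies of the same article, and a library whose copy is damaged repairs it from the version the majority of its peers still agree on.

import hashlib

def digest(text):
    return hashlib.sha256(text.encode()).hexdigest()

# Three libraries, each holding a copy of the same journal article.
article = "Journal of Examples, vol. 1: results and discussion."
libraries = {"lib_a": article, "lib_b": article, "lib_c": article}

# One copy gets corrupted (disk failure, bit rot, ...).
libraries["lib_b"] = "####### unreadable #######"

def repair(libraries, damaged):
    # Replace a damaged copy with the version most peers agree on.
    votes = {}
    for name, copy in libraries.items():
        if name != damaged:
            votes.setdefault(digest(copy), []).append(copy)
    libraries[damaged] = max(votes.values(), key=len)[0]

repair(libraries, "lib_b")
print(libraries["lib_b"] == article)   # True: the copy is restored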

Efficient Wind Power generation

Wind-powered turbines must be able to generate electricity at a cost that is competitive with fossil-fuel sources. One way to do this is to make cheaper windmills. Until now, large-scale wind turbines have faced into the wind. That makes them easier to design, but heavy: because the wind blows the turbine blades towards the supporting structure, they have to be made stiff enough to stop them bending and hitting the tower.

If the whole contraption could be turned around, and the fan placed downwind from the support pole, this problem would disappear. The blades could then be less stiff, and would therefore be lighter and up to 25% cheaper. So why, throughout history, have windmills always pointed upwind rather than downwind? The answer is that downwind turbines are tricky to design and subject to all sorts of aerodynamic interference caused by the supporting tower.

The main problem that the WTC had to solve was how to damp the vibrations caused when a blade passes through the “wind shadow” of the tower. Calculations suggest that downwind power could be generated on the site for about 3.5 cents per kilowatt-hour, ie, competitive with coal.

Bumping against the built-in speed limits of the Net
How do you make the internet go faster, other than by laying bigger, faster data pipes? There turns out to be a fundamental speed limit built into the internet's software foundations: the “transmission control protocol”, better known as TCP. The speed limit only becomes apparent at very high transmission speeds, measured in the hundreds of megabits per second (Mbps); at such speeds, the efficiency of the connection was found to be less than 30%. Why?

The problem stems from the way that TCP responds to congestion. The internet has been able to scale up from millions to billions of users over the past few years due to the simplicity of its design. Computers talk to each other in TCP using a simple rule to ensure that they make good use of available network capacity. One computer sends a chunk of data, called a packet, to another computer, and waits for an acknowledgment message, or ACK. If no ACK arrives, the sending computer assumes that the network is congested and the original packet has been lost, and scales back its transmission rate to half of the previous one. Once reliable transmission has been resumed, the sender gradually starts to increase the transmission rate, until eventually the network becomes congested again, the rate is halved, and so on. The advantage of this simple approach is that millions of computers can share a network with no need for centralised traffic control. When capacity is available, transmission speeds go up; when it is not, they go down.
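That rule is often described as additive increase, multiplicative decrease. A toy Python sketch of the feedback loop (my own illustration, not code from the article):

def adjust_rate(rate, ack_received, increase_step=1.0):
    if ack_received:
        # Capacity seems available: probe upwards gently.
        return rate + increase_step
    # No ACK: assume congestion and back off hard.
    return rate / 2

rate = 10.0
for ack in [True, True, True, False, True, True]:
    rate = adjust_rate(rate, ack)
    print(f"ack={ack}  rate={rate:.1f}")
# The rate climbs in small steps (11, 12, 13), is halved on the
# loss (6.5), then begins climbing again (7.5, 8.5).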

This approach works well on today's internet, which is a bewildering patchwork of different networks operating at different speeds. But difficulties arise when the bottlenecks in the internet are removed, as they are on the high-speed links used by scientists. The problem, says Dr Low, is that TCP reduces the transmission rate too drastically at the first sign of congestion, and only increases speed again gradually. It is, he says, akin to a driver who can see only ten metres in front of his car, and who increases speed gradually when the road seems clear, but slams on the brakes as soon as another car comes into view. “On a slow street it may work, but on a superhighway it does not,” he says.

So Dr Low and his team have devised a tweaked version of TCP, called FAST. Like the original TCP, it is a decentralised system: each computer monitors the responses to sent packets in order to adjust transmission speed in the face of varying levels of congestion. But FAST does more than simply check to see if an ACK has arrived for each packet sent. Instead, it takes into account the delay between the packet's transmission and the arrival of the corresponding ACK from the recipient. Calculating a running average of this delay time provides far more precise information about the congestion.
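A toy Python sketch of that delay-based idea (my own illustration, not the FAST implementation; the smoothing weights and gain are arbitrary): keep a running average of the round-trip delay and adjust the rate smoothly, speeding up while the average stays near the best delay seen and easing off as queues build, rather than waiting for a loss and then halving.

def smooth_adjust(rate, avg_delay, base_delay, gain=0.1):
    # Fraction of the measured delay that looks like queueing.
    congestion = (avg_delay - base_delay) / avg_delay
    # Near-empty path (congestion ~ 0): speed up a little.
    # Heavily queued path (congestion -> 1): slow down a little.
    return rate * (1 + gain * (0.5 - congestion))

rate, avg, base = 100.0, 20.0, 20.0
for sample in [20, 22, 30, 45, 40, 25]:      # round-trip delays in ms
    avg = 0.8 * avg + 0.2 * sample           # running average of delay
    rate = smooth_adjust(rate, avg, base)
    print(f"avg_delay={avg:5.1f} ms  rate={rate:6.1f}")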

Transmission speed can then be adjusted carefully, smoothly scaling back when the first signs of congestion appear, and quickly ramping up again once the congestion has eased. Using FAST, Dr Low and his colleagues were able to improve the efficiency of a 1,000 Mbps link so that it reached 95%—even in the presence of a small amount of background traffic from other users. In other words, the protocol is not just fast, but backwards compatible. Computers speaking FAST can share a network with other machines using standard TCP.

E-Learning Stats

The Indian e-learning market is still in its infancy at just $4 million as per 2002 estimates, with an expected 4-year CAGR of 20-25 per cent, the apex body pointed out.
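For a sense of what those growth rates mean in absolute terms, a quick calculation (my own arithmetic on the figures above): a $4 million market compounding at 20-25% a year for four years.

def project(value, cagr, years):
    # Compound a starting value at a constant annual growth rate.
    return value * (1 + cagr) ** years

for cagr in (0.20, 0.25):
    print(f"{cagr:.0%}: ${project(4.0, cagr, 4):.1f} million after 4 years")
# 20%: $8.3 million after 4 years
# 25%: $9.8 million after 4 years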

The worldwide corporate e-learning market is expected to reach revenues of $23 billion by 2005, with a compounded annual growth rate (CAGR) of 25 per cent, said the National Association of Software and Services Companies (Nasscom) in its latest survey.

Nasscom said that there will be a significant shift in demand for e-learning content. It is expected that by 2004, non-IT content will be the larger market, accounting for over 54 per cent of total revenues.

In terms of e-learning solutions, it is estimated that two-thirds of the global market is for off-the-shelf products and the main markets are the US and the UK. North America holds the maximum opportunity with two-thirds of worldwide revenues through 2005. However, Western Europe is the fastest growing market.

At the height of the dotcom madness, some forecasters were predicting the e-learning market might be worth as much as £7bn by 2005. The reality will be more like £2bn to £3bn, suggests Mr Horseman. This, though, is a sizeable market and demand is still growing. Just last month, Jobs & Money reported how the NHS is embracing e-learning to encourage its 500,000 staff to become more computer literate. Walk into most big name organisations and you'll probably find some sort of e-learning offering. In many areas, e-learning does have an edge over conventional training. Its flexibility, if managed right, can be an advantage. The fact you can test yourself and assess what you are learning as you go, and then go back to update your learning regularly, are other plus points

"Live" or recorded broadcasts of lectures are "dead" compared to edited, bite-sized modules which distil the essence of lessons and make them more fun by using tools such as Macromedia Flash animations, audio-on-demand and Web links.

Wednesday, September 03, 2003

Geek Quotes

Adding manpower to a late software project makes it later.
When all else fails, read the instructions.

Every task takes twice as long as you think it will take. If you
double the time you think it will take, it will actually take four
times as long.

There is always one item on the screen menu that is mislabeled and
should read "ABANDON HOPE ALL YE WHO ENTER HERE."

Do not believe in miracles. Rely on them.

Blessed is the end user who expects nothing, for he/she will not be
disappointed.


At the source of every error which is blamed on the computer you
will find at least two human errors, including the error of blaming
it on the computer.

Any system which depends on human reliability is unreliable.

Undetectable errors are infinite in variety, in contrast to
detectable errors, which by definition are limited.

Investment in reliability will increase until it exceeds the
probable cost of errors, or until someone insists on getting
some useful work done.

The first myth of management is that it exists.

Any given program, when running, is obsolete.
If a program is useful, it will have to be changed.
If a program is useless, it will have to be documented.
Any given program will expand to fill all available memory.
The value of a program is proportional to the weight of its
output.
Program complexity grows until it exceeds the capability of the
programmer who must maintain it.


Inside every large program is a small program struggling to get out.


There's never time to do it right, but always time to do it over.

If there is a possibility of several things going wrong, the one
that will cause the most damage will be the one to go wrong.


The first ninety percent of the task takes ninety percent of the
time, and the last ten percent takes the other ninety percent.


The man who can smile when things go wrong has thought of someone
he can blame it on.

An ounce of image is worth a pound of performance.

Judgement comes from experience; experience comes from poor judgement.

Build a system that even a fool can use, and only a fool will want to
use it.

Given any problem containing N equations, there will be N+1
unknowns.

An object or bit of information most needed will be least
available.

Any device requiring service or adjustment will be least
accessible.

In any human endeavor, once you have exhausted all possibilities
and fail, there will be one solution, simple and obvious, highly visible
to everyone else.

Badness comes in waves.


After designing a useful routine that gets around a familiar "bug"
in the system, the system is revised, the "bug" is taken away, and
you're left with a useless routine.

Efforts in improving a program's "user friendliness" invariably
lead to work in improving users' "computer literacy."

That's not a "bug," that's a feature!

An expert is a person who avoids the small errors while sweeping on
to the grand fallacy.

"You can use an eraser on the drafting table or a sledgehammer on the construction site." Frank Lloyd Wright