Author's Opinion

The views in this column are those of the author and do not necessarily reflect the views of iTWire.


Wednesday, 30 January 2008 17:00

The Linux kernel: now and then

When Linus Torvalds was writing his kernel way back in the 1990s, he was working in a small bedroom in an average house in Finland and using hardware that wasn't exactly top of the range.

Torvalds was also using his own money to do what he loved.

Sixteen years and some after the release of the first version of Linux (the kernel) in 1991, development is largely done in air-conditioned rooms and funded by a variety of sources, with big companies figuring prominently.

These companies hire kernel developers and allow them to spend all or part of their working hours on the kernel.

Needless to say, all these companies have a stake in the kernel - in other words, Linux translates into dollars for them.

The editor of Linux Weekly News, Jonathan Corbet, who presents what he calls The Kernel Report (this is the fourth time he's done it in Australia), provided plenty of interesting details about the kernel in his talk at the Australian national Linux conference today.

Apart from his editorial activities, Corbet himself is an active kernel contributor.

Among the companies that fund contributions to the kernel in the manner described, Red Hat, the premier Linux company, stands first with 11 per cent. A larger percentage is funded by individuals (17 per cent).

The Linux Foundation, which employs Torvalds, funded 2 per cent of the work.

Other well-known companies involved in funding included IBM (8 per cent), Novell (7 per cent), Intel (4 per cent), Oracle (2 per cent), Google (1 per cent) and SGI (1 per cent).

Corbet also charted development periods from the first kernel up to the 2.6.0 branch, when major releases were still being made - the last such transition, from 2.4.0 to 2.6.0, took nearly three years, whereas development time for earlier major versions was shorter.

These days, point releases are made instead, and these arrive much sooner. For instance, 2.6.24 has just been released, and given the existing schedule it is likely that 2.6.25 will be released sometime in April, Corbet said.

The release cycle now generally includes a two-week period when changes are merged; there is then an eight to 12-week period when stabilisation work is carried out.

These days, given the number of developers contributing, a huge number of patches goes in during each merge period, Corbet said.

The current merge window will close about two weeks from now, slightly later than usual because Torvalds is attending the conference.

In April 2005, kernel development moved from a proprietary source code management system called BitKeeper to git, a system created by Torvalds and others.

Since the switch to git, Corbet said, development has followed a more or less staircase pattern, with the system of the two-week merge window beginning in April 2005.

And, to give an idea of the amount of work that goes in, he said that from 2.6.20 rc3 to 2.6.20 rc7, two million lines of code were changed, with 750,000 lines being added.

Looking at 2007, he said that 1900 developers had contributed code to the kernel and nearly 200 companies had been involved though only 11 contributed more than 1 per cent of the total.
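Statistics like these are typically mined from the kernel's git history. As a rough sketch of the kind of query involved - the throwaway repository below stands in for a real kernel clone, against which one would instead use release tags such as v2.6.24 - git's shortlog command ranks authors by commit count:

```shell
# Sketch only: build a tiny disposable repository with two authors so the
# commands run anywhere, then count contributions per author.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git config user.name "Alice"
git config user.email "alice@example.com"
echo a > f.txt && git add f.txt && git commit -qm "first"
git config user.name "Bob"
git config user.email "bob@example.com"
echo b >> f.txt && git commit -qam "second"
# git shortlog -ns lists authors ranked by number of commits
git shortlog -ns HEAD
```

Against a real kernel tree, a range such as `git shortlog -ns v2.6.23..v2.6.24` would give the per-release breakdown; attributing commits to companies, as Corbet's figures do, requires additionally mapping author email domains to employers.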






Sam Varghese


Sam Varghese has been writing for iTWire since 2006, a year after the site came into existence. For nearly a decade thereafter, he wrote mostly about free and open source software, based on his own use of this genre of software. Since May 2016, he has been writing across many areas of technology. He has been a journalist for nearly 40 years in India (Indian Express and Deccan Herald), the UAE (Khaleej Times) and Australia (Daily Commercial News (now defunct) and The Age). His personal blog is titled Irregular Expression.


