Introduction
The Linux OS was first created by a student at the University of Helsinki in Finland. The author's name was Linus Torvalds, and an interest that turned into a passion for Minix, a small Unix-like system, led him to build a system that exceeded the Minix standard. He began working on it in 1991 and worked heavily until 1994, when version 1.0 of the Linux kernel was released. The Linux kernel lays the foundation on which the Linux operating system is formed. Hundreds of organizations and businesses currently employ people to release versions of operating systems that use the Linux kernel. The operating characteristics and adaptability of Linux have made Linux and the Windows OS excellent alternatives to other operating systems. IBM and other giant companies around the world support Linux and its ongoing development, a decade after the original version. The operating system is also embedded into chips, using a process called "integration," to increase the efficiency of appliances and devices.
History of Linux
Through the 1990s, some technically savvy computer hobbyists interested in desktop environments developed such systems. These systems, including GNOME and KDE, run on Linux and are available to everyone, regardless of a person's motivation for using the system. Linus Torvalds himself had been interested in learning the capabilities and features of the 80386 processor for task switching. His application was originally called Freax and was first developed using the Minix operating system.
Both Freax and Minix seemed to sacrifice performance for academic research and study. Many computer specialists have since revised assumptions that held in the early 1990s: portability is now a common goal for specialists in the information field, and it is definitely not just an academic requirement for software. Various ports to IA-32, PowerPC, MIPS, Alpha, and ARM were made, with supporting products made and sold to wholesalers and retailers; one vendor gave Linus an Alpha system when porting moved to a high place on his list of priorities.
History of Windows
Microsoft's leadership was shared by Bill Gates and Paul Allen until 1977, when Bill Gates became president and Paul Allen vice president. In 1978, the disk drives of Tandy and Apple machines were 5.25 inches. The first COMDEX computer show in Las Vegas introduced 16-bit chips, and manufacturers introduced the Intel 8086 chip. Al Gore came up with the phrase "information highway." The same year, Apple cofounder Steve Wozniak developed the machine's first programming language, called Integer Basic; the language was quickly replaced by Microsoft's Applesoft Basic.
Also in 1978, there was a machine with a comprehensive, self-contained design priced at less than $800: the Commodore PET, the Personal Electronic Transactor. On 4/11/78, Microsoft announced its third language product, Microsoft COBOL-80. On November 1, 1978, after the introduction of that third language, Microsoft opened its first international sales office in Japan, appointing ASCII Microsoft, located in Tokyo, as its exclusive sales agent for the Far East. Finally, on New Year's Eve of 1978, Microsoft announced that year-end sales were over $1 million. The next year, in April 1979, Microsoft BASIC for the 8080 became the first microprocessor product to win the ICP Million Dollar Award. Software had been dominated by the mainframe, so this recognition of the PC reflected the industry's development and acceptance of the personal computer.
Both Gates and Allen returned home to Bellevue, Washington, announced plans to open offices in their hometown, and became the first microcomputer software company in the Northwest.
Technical details of both Linux and Windows operating systems
An operating system takes care of all input and output in a computer. It manages users, processes, memory, printing, telecommunications, networking, and so on. The operating system sends data to the disk, printer, monitor, and other peripherals connected to the computer. A computer cannot function without an operating system: the operating system tells the machine how to process instructions from input devices and from software running on the computer. Because each computer is built differently, those instructions must be handled differently on each. In most cases, an operating system is not one giant nest of programs, but rather a small system of programs that run from the core, or kernel. Because the operating system's core is so compact, it is easier to rewrite parts of the system as small packages than to redesign an entire program.
When first created, operating systems were designed to help applications interact with the hardware. That remains true today, but the importance of the OS has increased to the point where the operating system defines the computer. The operating system provides a layer of abstraction between the user and the machine when they communicate. Users do not see the hardware directly, but view it through the operating system. This abstraction can be used to hide certain hardware details from applications and users.
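The abstraction layer described above can be seen in a minimal sketch: a program stores and retrieves data through the OS's generic file interface and never addresses the disk hardware directly (the file name below is arbitrary, chosen only for illustration).

```python
# Minimal sketch of OS abstraction: the same high-level calls work on any
# machine, because the operating system translates them into
# device-specific disk operations behind the scenes.
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "os_demo.txt")

with open(path, "w") as f:   # the OS allocates blocks and updates metadata
    f.write("hello")

with open(path) as f:        # the OS schedules the read and fills a buffer
    data = f.read()

print(data)                  # prints "hello" on any OS or disk hardware
os.remove(path)
```

The program above would behave identically whether the file lands on a SCSI disk, a SATA disk, or a RAM disk; that indifference is exactly the abstraction the operating system provides.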
Application software is written for a general purpose, but sometimes for a single machine's function; some such software will run on any machine. Examples include SABRE, the reservation system for airlines, and defense systems. Computer Aided Software Engineering (CASE) addresses the fact that creating software is a costly and time-consuming process: CASE programs support, and in some cases replace, the engineer in creating computer programs. CAD/CAM systems are Computer Aided Design and Computer Aided Manufacturing; with circuit board design done in a computer program, the possibilities multiply, such as premanufactured elements, load calculations, and simulations of how a building will hold up in earthquakes.
In the Linux world there is a question that has been going back and forth for a while now: is SCSI dead for workstations? There have been many advances in SATA, and the mainstream acceptance of the 10K RPM Western Digital Raptor may have made SCSI too expensive for what is needed in a workstation. It is time to take a look from the Linux side: how does the Western Digital Raptor WD740GD compare with three of the latest Ultra320 SCSI drives, the Seagate Cheetah 10K.7, Cheetah 15K.3, and Cheetah 15K.4? This section covers the drives' technology, noise, heat, size, and performance.
Let's take a look at the latest generation of the Seagate 10K Cheetah and 15K Cheetah lines. We will also take an in-depth look at the latest 10K SATA drive, the Western Digital 74GB WD740GD. Starting with the Western Digital Raptor: WD pushes this drive as a low-cost response to SCSI. On their website, they like to show off the drive's 1,200,000-hour MTBF (mean time between failures), which matches the MTBF of the last-generation Seagate Cheetah 15K.3 and is very close to the reliability rating of today's Cheetahs.
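As a back-of-the-envelope check on what such an MTBF figure means for an always-on drive, the common approximation AFR ≈ hours-per-year / MTBF can be applied. This conversion is a standard rule of thumb, not something the vendors' spec sheets state:

```python
# Rough annualized failure rate (AFR) implied by a quoted MTBF, assuming
# the drive runs 24/7 and the usual approximation AFR = hours/year / MTBF.
HOURS_PER_YEAR = 24 * 365         # 8760 hours of continuous operation
mtbf_hours = 1_200_000            # the Raptor's quoted MTBF

afr = HOURS_PER_YEAR / mtbf_hours
print(f"Implied AFR: {afr:.2%}")  # about 0.73% of drives failing per year
```

In other words, across a large fleet of such drives, the quoted MTBF suggests under one failure per hundred drives per year of continuous use.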
On the Cheetah's spec sheet, the drive is described as designed for "high performance around the clock use." Both the Cheetah and the Western Digital Raptor drives have the same amount of cache memory. In a multi-tasking/multi-user environment, command queuing is a technical advantage. All Ultra320 SCSI drives support what is called Native Command Queuing, or NCQ, a technique in which all commands sent to the drive can be queued and reordered into the most efficient order. This keeps the drive from having to service a request on one side of the disk, travel to the other side of the disk to serve another request, and then travel back for the next one. While some SATA drives support NCQ, the Raptor does not. The Raptor uses another form of queuing called Tagged Command Queuing, or TCQ. This method is not as effective as NCQ and requires support from both the drive and the host controller. From what I was able to determine, TCQ support is sparse, even under Windows.
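The benefit of reordering can be shown with a toy model. The block addresses and the one-pass "elevator" sweep below are illustrative only; real NCQ firmware also accounts for rotational position, not just seek distance.

```python
# Toy model of command queuing: serving disk requests in a sorted sweep
# instead of arrival order greatly reduces total head travel.
def seek_distance(requests, start=0):
    """Total head travel when requests are served in the given order."""
    total, pos = 0, start
    for lba in requests:
        total += abs(lba - pos)
        pos = lba
    return total

pending = [98, 3, 77, 12, 90, 5]                # hypothetical queued addresses

arrival_order = seek_distance(pending)          # back-and-forth seeking
queued_order = seek_distance(sorted(pending))   # one elevator-style sweep

print(arrival_order, queued_order)              # reordering travels far less
```

On this made-up queue, arrival order forces the head to cross most of the disk repeatedly, while the sorted sweep visits every request in a single pass.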
The SATA drive addresses the reliability claim by citing the use of fluid dynamic bearings in the drive. Fluid dynamic bearings replace ball bearings to reduce drive wear and drastically reduce operating noise.
Microsoft Windows XP technology makes it easy to enjoy games, music, and movies, and to go beyond that into movie making and enhanced digital photography. DirectX 9.0 technology drives high-speed multimedia and various computer games. DirectX provides breathtaking graphics, sound, music, and three-dimensional animation that bring games to life. DirectX is also the link that allows software engineers to develop high-speed, multimedia-driven games for your computer. DirectX was introduced in 1995, and its popularity has soared as the development of multimedia applications has reached new heights. Today, DirectX has evolved into an Application Programming Interface (API) for Microsoft Windows operating systems. This means software developers can access hardware features without having to write hardware-specific code.
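The idea of an API hiding hardware differences can be sketched generically. Everything below (the class and method names, the two "vendors") is hypothetical and only illustrates the pattern; it is not DirectX's actual interface.

```python
# Generic sketch of a hardware-abstraction API: applications program
# against one stable interface, while device-specific detail lives in
# interchangeable back-ends supplied by drivers.
class GraphicsDevice:
    """The stable API surface an application codes against."""
    def draw_triangle(self) -> str:
        raise NotImplementedError

class VendorADevice(GraphicsDevice):
    def draw_triangle(self) -> str:
        return "vendor-A triangle commands"

class VendorBDevice(GraphicsDevice):
    def draw_triangle(self) -> str:
        return "vendor-B triangle commands"

def render(device: GraphicsDevice) -> str:
    # The application never knows which hardware it is talking to.
    return device.draw_triangle()

print(render(VendorADevice()))
print(render(VendorBDevice()))
```

The game calls one `render` path; swapping the graphics card swaps only the back-end, which is exactly what lets developers avoid writing hardware-specific code.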
Among the features of Windows Media Player 9 Series, the smart jukebox gives users more control over their music. With easy CD transfer to the PC, CD burning and compatibility with portable players are available. Users can also discover more through premium entertainment services. Windows Media Player 9 Series works well with Windows XP, using its built-in digital media features, and offers a state-of-the-art experience. When Windows Millennium Edition came to stores in 2000, it was designed specifically for home users, and it was the first Microsoft product with video editing: Movie Maker is used to capture, organize, and edit video clips, then export them for PC playback or the web. Movie Maker 2, released in 2003, adds new transitions, jazzy titles, and neat special effects. Based on Microsoft DirectShow and Windows Media technologies, Movie Maker was originally included only with Windows Millennium Edition. Now Movie Maker 2 is available for Windows XP Home Edition and Windows XP Professional.
With the release of Windows XP in 2001 came Windows Messenger, bringing instant messaging to users via the Internet. Users communicate with text messages in real time in Windows Messenger, and real-time messaging with video conferencing has been available for some time now. The communication tool provided by Windows Messenger combines comprehensive, easy-to-use text chat, voice and video, and data collaboration.
Linux is developed openly and is thus free for distribution in source form. Linux is developed and made available via the Internet; many of the engineers involved in its production are overseas and never meet each other. This development at the source code level, and on a large scale, has led the way to Linux becoming a stable and featureful system.
Eric Raymond has written a popular essay on the development of Linux called The Cathedral and the Bazaar. It describes how the Linux kernel uses a bazaar approach that releases code quickly and often, and relies on the input this elicits to improve the system. The bazaar approach is contrasted with the cathedral approach used by other projects such as the core of GNU Emacs. The cathedral approach is characterized by the pursuit of the most beautiful code before anything is released, but unfortunately releases come much less frequently, and people outside the group have poor opportunities to contribute to the process.
A highlight of successful bazaar projects is that the code is open for everyone to observe, starting at the planning stage. When it comes to debugging, the open bazaar makes it possible for everyone to find errors in the code; and when those people can also fix the code, this saves a great deal of effort and helps the developers.
Advantages and disadvantages of the two operating systems
Chris Browne, the author of a Linux OS web page, describes how the Linux effort is distributed, along with some of the advantages and disadvantages of the Linux OS. Linux comes with experimental versions, such as the 2.5.x series, where the version numbers go up steadily each week. The stable version changes only when errors are detected in the system, and the errors must be confirmed through the test series, so it does not change very often. Linux users know this is happening and work to resolve the errors. There is no guarantee that all users will immediately fix the problems with their systems, but for those affected (or who noticed they were affected) by the problems, fixes are available quickly, sometimes distributed across the Internet after several hours of diagnosis. Linux fixes are often available more quickly than those of commercial vendors such as Microsoft, HP, and IBM, and usually the diagnosis happens before you even know there is a problem. This contrasts with the conduct of other companies: Bill Gates claims in Microsoft press releases that Microsoft code has no bugs. This seems to mean that there are no bugs Microsoft is keen to correct.
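The split between experimental series like 2.5.x and the stable series follows the old kernel numbering convention: in the 2.x era, an even minor number (2.0, 2.2, 2.4) marked a stable series and an odd one (2.3, 2.5) a development series. A tiny sketch of that rule (the helper function name is my own):

```python
# Classify an old-style (2.x era) Linux kernel version string: even
# minor numbers were stable series, odd minor numbers were development.
def kernel_series(version: str) -> str:
    major, minor = (int(part) for part in version.split(".")[:2])
    return "stable" if minor % 2 == 0 else "development"

print(kernel_series("2.4.20"))  # prints "stable"
print(kernel_series("2.5.3"))   # prints "development"
```

This is why users could tell at a glance whether a kernel was meant for production use or for testing the newest changes.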
Microsoft has concluded that the majority of errors identified in its systems occur because users do not use the software properly; the remaining problems, in Microsoft's view, are few in number and caused by actual errors. By contrast, a stable Linux system, configured with stable kernels and software suited to the system's workload, should run for hundreds of days without rebooting the computer. Some of the general public, as well as IT professionals such as engineers and technicians, complain that Linux is always changing. Chris says that "effort and interest in the Linux kernel will stop when people want to stop building and strengthening the Linux kernel." As long as new technology and devices such as video cards are manufactured, and people interested in Linux invent new enhancements for it, work on the Linux OS will progress.
A possible disadvantage of the Linux OS is that it may fade because a better platform for kernel hacking appears, or because in the future it becomes so unwieldy that it is displaced. This has not happened yet, but many who discuss the future of Linux, with its various plans to serve the consumer or the enterprise, note that functionality is being moved out of the base kernel and into user space, which leaves less room in the kernel for data and features. The announcement of the Debian Hurd effort proposes an alternative outlet for kernel hacking. The Hurd kernel, which runs and ships as a set of processes on top of a microkernel such as Mach, may provide a system for people who are dissatisfied with changes to the Linux kernel. Mach has a "message passing" abstraction that allows the operating system to be created as a set of servers that work in conjunction with one another.
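That "message passing" abstraction can be sketched with ordinary queues standing in for Mach ports. The file-system server, the port names, and the message format below are all hypothetical, meant only to show OS services cooperating by exchanging messages rather than calling each other directly.

```python
# Toy sketch of a microkernel-style message-passing system: an OS
# service runs as its own server process and is reached only via messages.
import queue
import threading

fs_port = queue.Queue()   # stands in for the file-system server's port

def fs_server():
    """Serve a single request, then exit (a real server loops forever)."""
    op, payload, reply_port = fs_port.get()   # block until a message arrives
    if op == "read":
        reply_port.put(f"contents of {payload}")

server = threading.Thread(target=fs_server)
server.start()

reply = queue.Queue()
fs_port.put(("read", "/etc/motd", reply))   # send a request message
response = reply.get()                      # wait for the reply message
server.join()

print(response)
```

The caller never invokes the file-system code directly; it only knows the port, which is what lets a microkernel swap or restart individual servers without touching the rest of the system.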
Competitive, cooperative efforts
To start this section, the PC's roots trace back to IBM. A vertically integrated, proprietary, de facto standard architecture was the norm for the first three decades of the postwar computer industry. Each computer manufacturer made most, if not all, of its technology in-house, and that technology was sold as part of an integrated computer. This era of integrated systems ran from the 1964 introduction of the IBM System/360 to the 1981 release of the IBM personal computer. It was undone by two different approaches. One was the fragmentation of proprietary standards in the computer field among various suppliers, which led Microsoft and Intel to seek industry-wide dominance for their proprietary components of the overall system architecture, ending what Moschella (1997) identifies as the systems era (1964-1981). The second was a movement by customers and second-tier manufacturers to construct industry-wide "open" systems in which the standard did not belong to a single company.
The adoption of Linux in the late 1990s was a response to these earlier approaches. Linux was the most commercially accepted example of a new wave of "open source" software: software whose source code is distributed free to use and modify. The advantages of Linux stand in contrast to proprietary PC standards, especially the software standards set by Microsoft. Product compatibility is generally described using a simple, unidimensional typology, bifurcated between "compatible" and "incompatible." To better illuminate the differences between proprietary and open standards strategies, Gabel (1987) uses a multidimensional classification, with each dimension assuming one of many (discrete) levels:
"Multivintage" compatibility between successive generations of the product:
"Order Products" compatibility, providing interoperability between the breadth of the company
range of products such as Microsoft with their families Windows CE, 95/98/ME, NT/2000 and product.
"Multivendors" compatibility, ie compatibility between products of competing manufacturers.
The first successful multi-vendor operating system was Unix, developed by a team of computer scientists at Bell Telephone Laboratories (BTL) in New Jersey, beginning in 1969. As with the earlier Multics research project shared between MIT, BTL, and mainframe computer maker General Electric, Unix was a multi-user time-sharing operating system designed as a research project by the developers for their own personal use. Other features key to the success of Unix reflected path dependencies of the developers and early users (Salus 1994):
AT&T was forbidden by its 1956 consent decree from being in the computer business, so it could not sell the operating system commercially. Following the publication of research papers, Bell Labs was flooded with requests from university computer science departments, which received licenses and source code but no support. With budget constraints limiting the BTL researchers to a DEC minicomputer rather than a large mainframe, Unix was simpler and more efficient than its predecessor Multics, based on the simplified programming language C rather than the more widely used PL/I. Although originally developed on DEC hardware, Unix was converted to run on other models by users who found programmer time cheaper than buying a supported model, laying the foundation for a hardware-independent operating system.
Perhaps the most significant development was the licensing of Unix by the UC Berkeley Computer Science Department in 1973. The Berkeley group issued its own versions from 1977 to 1994, with much of its funding provided by the Defense Advanced Research Projects Agency (DARPA). The results of the Berkeley development included (Garud and Kumaraswamy 1993; Salus 1994):
The first version of Unix to support the TCP/IP protocols, afterwards the standard Internet protocols;
Adoption of the academic BSD version of Unix as the preferred OS by computer science departments around the world;
Commercial distribution of BSD-derived Unix through Sun Microsystems, cofounded by former BSD programmer Bill Joy;
As the versions evolved, the fragmentation of Unix programmers into rival "BSD" and "AT&T" camps.
AT&T Unix provided a multi-vendor standard that, combined with the developments at BSD, helped spur the adoption of networked computing. Helped by Sun, whose motto was "the network is the computer," Unix quickly gained acceptance during the 1980s as the preferred OS for networked engineering workstations (Garud and Kumaraswamy 1993). At the same time, it became the de facto multi-vendor standard for minicomputer producers with small customer bases and R&D budgets too limited for a mature in-house OS, who licensed Unix from AT&T. The main exceptions to the rise of Unix were the early leaders in workstations (Apollo) and minicomputers (DEC), which used their own OS as a source of competitive advantage and were the last to switch to Unix in their respective categories.
Supporters of the two camps created a range of professional associations to promote Unix and related operating systems. These fueled the adoption and standardization of Unix, which they hoped would increase the amount of application software available to compete with the sponsored, proprietary architectures (Gabel 1987; Grindley 1995). The two groups promoted their work under the banner of "open systems"; the authors of a series of books on these systems summarized the objectives as follows: open systems allow users to move applications between systems easily, so purchasing decisions can be based on cost-effectiveness and vendor support rather than on which system runs a user's suite of applications (Salus 1994: v).
Despite these objectives, the Unix community spent the 1980s and early 1990s fragmented into warring AT&T and Berkeley factions, each of which sought control of the OS APIs to maximize the software available for its own version, and each of which had its own supporters. To avoid paying the switching costs of the old proprietary mainframes, U.S. Defense Department procurement decisions began to favor Unix systems over proprietary ones. As AT&T formalized its System V Interface Definition and encouraged hardware makers to adopt System V, it became the multi-vendor standard required by Department of Defense procurements.
The BSD group developed only on DEC minicomputers, so its Unix variant was not multi-vendor and was less appealing for DoD contracts. But numerous BSD innovations in usability, software development tools, and networking made it more attractive to computer scientists for research and teaching, making it the minicomputer OS preferred by computer science departments in the U.S., Europe, and Japan (Salus 1994). The divergent innovation meant that the two major Unix variants differed in internal structure, user commands, and application programming interfaces (APIs). It was this last difference that hit computer buyers hardest, since custom software developed for one type of Unix could not immediately run on the other, adding switching costs between the two systems. Also, the DARPA-funded, network-based distribution of user-donated code libraries, while free, often required site-specific custom programming if the Unix APIs in user space differed from those faced by the original donor.
"Microsoft Windows continues to invest in a product family based on the Itanium processor, and the Itanium Solutions Alliance will promote that investment, helping to develop the ecosystem of applications and solutions available for the Windows platform and SQL Server 2005," said Bob Kelly, general manager, Windows infrastructure, Microsoft Corp. "We look forward to working with members of the Itanium Solutions Alliance to help IT managers move from RISC servers based on Unix to Itanium-based systems running on the Windows platform."