Friday, January 20, 2012

What is an OS (Operating System)?

An operating system (often referred to as the OS) is an integrated set of programs that controls the resources (the CPU, memory, I/O devices, etc.) of a computer system and provides its users with an interface or virtual machine that is more convenient to use than the bare machine. According to this definition, the two primary objectives of an operating system are:

1.       Making a computer system convenient to use – An operating system is a layer of software on top of the bare hardware of a computer system, which manages all parts of the system and presents the user with an interface or virtual machine that is easier to program and use. That is, the operating system hides the details of the hardware resources from the programmer and provides a convenient interface for using the computer system. It acts as an intermediary between the hardware and its users, providing a high-level interface to low-level hardware resources and making it easier for programmers and other users to access and use those resources.

The logical architecture of a computer system is shown in the figure below.



As shown in the figure, the hardware resources are surrounded by the operating system layer, which in turn is surrounded by a layer of other system software (such as compilers, editors, and utilities) and a set of application programs (such as commercial data processing applications, scientific and engineering applications, and entertainment and educational applications). Finally, the end users view the computer system in terms of the user interfaces provided by the application programs.

2.       Managing the resources of a computer system – The second important objective of an operating system is to manage the various resources of the computer system. This involves tasks such as keeping track of who is using which resource, granting resource requests, accounting for resource usage, and mediating conflicting requests from different programs and users. The efficient and fair sharing of resources among users and/or programs is a key goal of most operating systems.


Main Functions of an Operating System

The main functions performed by most operating systems of today are as follows:

1.       Process Management – The process management module of an operating system takes care of the creation and deletion of processes, the scheduling of various system resources among the different processes requesting them, and the provision of mechanisms for synchronization and communication among processes.

2.       Memory Management – The memory management module of an operating system takes care of the allocation and deallocation of memory space to the various programs in need of this resource.

3.       File Management – The file management module of an operating system takes care of file-related activities such as the organization, storage, retrieval, naming, sharing, and protection of files.

4.       Security – The security module of an operating system protects the resources and information of a computer system against destruction and unauthorized access.

5.       Command Interpretation – The command interpretation module of an operating system takes care of interpreting user commands and directing system resources to handle the requests. With this mode of interaction, the user is usually not too concerned with the hardware details of the system.

In addition to the major functions listed above, an operating system also performs a few other functions, such as keeping an account of which users (or processes) use how much and what kinds of computer resources, maintaining a log of system usage by all users, and maintaining an internal time clock.
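As a toy illustration of the process management function described above, a round-robin scheduler gives each ready process a fixed time slice (quantum) of the CPU in turn. The process names and burst times below are hypothetical, and a real kernel scheduler is of course far more involved; this sketch only shows the queueing idea:

```python
from collections import deque

def round_robin(burst_times, quantum):
    """Toy round-robin scheduler: returns the completion time of each process.

    burst_times: dict mapping a (hypothetical) process name to its CPU
    burst in abstract time units; quantum: the fixed time slice per turn.
    """
    ready = deque(burst_times.items())   # ready queue of (name, remaining time)
    clock = 0
    completion = {}
    while ready:
        name, remaining = ready.popleft()
        run = min(quantum, remaining)    # run for at most one quantum
        clock += run
        remaining -= run
        if remaining > 0:
            ready.append((name, remaining))  # preempted: go to back of queue
        else:
            completion[name] = clock         # process finished at this time
    return completion

print(round_robin({"P1": 5, "P2": 3, "P3": 1}, quantum=2))
# → {'P3': 5, 'P2': 8, 'P1': 9}
```

Note how the shortest process (P3) finishes first even though it was submitted last: time-slicing prevents a long process from monopolizing the CPU.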


Measuring System Performance

The efficiency of an operating system and the overall performance of a computer system are usually measured in terms of the following:
1.       Throughput – Throughput is the amount of work that the system is able to do per unit time. It is measured as the number of processes completed by the system per unit time. For example, if n processes are completed in an interval of t seconds, the throughput is taken as n/t processes per second during that interval. Throughput is normally measured in processes/hour. Note that the value of throughput depends not only on the capability of a system, but also on the nature of the jobs being processed by it. For long processes, throughput may be one process/hour; for short processes, it may be 100 processes/hour.
2.       Turnaround time – From the point of view of an individual user, an important criterion is how long it takes the system to complete a job submitted by him/her. Turnaround time is the interval from the time of submission of a job to the system for processing to the time of completion of the job. Although higher throughput is desirable from the point of view of overall system performance, individual users are more interested in better turnaround time for their jobs.
3.       Response time – Turnaround time is usually not a suitable measure for interactive systems, because in an interactive system a process can produce some output early during its execution and continue executing while previous results are being output to the user. Hence, another measure used in the case of interactive systems is response time, which is the interval from the time of submission of a job to the system for processing to the time the first response for the job is produced by the system.
In any computer system, it is desirable to maximize throughput and minimize turnaround time and response time.
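As a rough illustration of these three measures, they can be computed from a batch of completed jobs as follows. The job times below are invented for the example, not benchmarks of any real system:

```python
def performance_metrics(jobs, interval):
    """Compute throughput, average turnaround time, and average response time.

    jobs: list of (submit, first_response, finish) times in seconds;
    interval: length of the measurement window in seconds.
    """
    throughput = len(jobs) / interval                       # n/t processes per second
    turnaround = [finish - submit for submit, _, finish in jobs]
    response   = [first - submit for submit, first, _ in jobs]
    return {
        "throughput_per_sec": throughput,
        "avg_turnaround": sum(turnaround) / len(jobs),      # submission to completion
        "avg_response": sum(response) / len(jobs),          # submission to first output
    }

# Three hypothetical jobs observed over a 30-second window.
jobs = [(0, 2, 10), (5, 6, 20), (8, 12, 30)]   # (submit, first output, finish)
print(performance_metrics(jobs, interval=30))
```

Here the throughput is 3/30 = 0.1 processes per second, while the average turnaround time ((10 + 15 + 22)/3 ≈ 15.7 s) is much larger than the average response time ((2 + 1 + 4)/3 ≈ 2.3 s), matching the distinction drawn above for interactive systems.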


Sunday, January 8, 2012

Computer Animation Tutorial, 2D, 3D

Some typical applications of computer-generated animation are entertainment (motion pictures and cartoons), advertising, scientific and engineering studies, and training and education. Although we tend to think of animation as implying object motions, the term computer animation generally refers to any time sequence of visual changes in a scene. In addition to changing object position with translations or rotations, a computer-generated animation could display time variations in object size, color, transparency, or surface texture. Advertising animations often transition one object shape into another: for example, transforming a can of motor oil into an automobile engine. Computer animations can also be generated by changing camera parameters, such as position, orientation, and focal length. And we can produce computer animations by changing lighting effects or other parameters and procedures associated with illumination and rendering.

Many applications of computer animation require realistic displays. An accurate representation of the shape of a thunderstorm or other natural phenomenon described with a numerical model is important for evaluating the reliability of the model. Also, simulators for training aircraft pilots and heavy-equipment operators must produce reasonably accurate representations of the environment. Entertainment and advertising applications, on the other hand, are sometimes more interested in visual effects. Thus, scenes may be displayed with exaggerated shapes and unrealistic motions and transformations. There are many entertainment and advertising applications that do require accurate representations for computer-generated scenes. And in some scientific and engineering studies, realism is not a goal. For example, physical quantities are often displayed with pseudo-colors or abstract shapes that change over time to help the researcher understand the nature of the physical process.

Design of Animation Sequences

In general, an animation sequence is designed with the following steps:

·         Storyboard layout

·         Object definitions

·         Key-frame specifications

·         Generation of in-between frames

This standard approach for animated cartoons is applied to other animation applications as well, although there are many special applications that do not follow this sequence. Real-time computer animations produced by flight simulators, for instance, display motion sequences in response to settings on the aircraft controls. And visualization applications are generated by the solutions of numerical models. For frame-by-frame animation, each frame of the scene is separately generated and stored. Later, the frames can be recorded on film, or they can be consecutively displayed in “real-time playback” mode.

The storyboard is an outline of the action. It defines the motion sequence as a set of basic events that are to take place. Depending on the type of animation to be produced, the storyboard could consist of a set of rough sketches or it could be a list of the basic ideas for the motion.

An object definition is given for each participant in the action. Objects can be defined in terms of basic shapes, such as polygons or splines. In addition, the associated movements for each object are specified along with the shape.

A key frame is a detailed drawing of the scene at a certain time in the animation sequence. Within each key frame, each object is positioned according to the time for that frame. Some key frames are chosen at extreme positions in the action; others are spaced so that the time interval between key frames is not too great. More key frames are specified for intricate motions than for simple, slowly varying motions.

In-betweens are the intermediate frames between the key frames. The number of in-betweens needed is determined by the medium to be used to display the animation. Film requires 24 frames per second, and graphics terminals are refreshed at the rate of 30 to 60 frames per second. Typically, time intervals for the motion are set up so that there are from three to five in-betweens for each pair of key frames. Depending on the speed specified for the motion, some key frames can be duplicated. For a 1-minute film sequence with no duplication, we would need 1440 frames. With five in-betweens for each pair of key frames, we would need 288 key frames. If the motion is not too complicated, we could space the key frames a little farther apart.
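The simplest way to generate in-betweens is to linearly interpolate an object's position between two key frames. The coordinates below are made-up example values, and production animation systems typically use splines or easing curves rather than straight lines, but this sketch shows the basic idea:

```python
def in_betweens(key_a, key_b, n):
    """Generate n linearly interpolated in-between positions.

    key_a, key_b: (x, y) object positions at two successive key frames
    (hypothetical values); n: number of in-betweens per key-frame pair
    (typically three to five, as noted above).
    """
    (x0, y0), (x1, y1) = key_a, key_b
    frames = []
    for i in range(1, n + 1):
        t = i / (n + 1)                 # fraction of the way from key_a to key_b
        frames.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return frames

# Three in-betweens for an object moving from (0, 0) to (100, 40).
print(in_betweens((0, 0), (100, 40), n=3))
# → [(25.0, 10.0), (50.0, 20.0), (75.0, 30.0)]
```

With n = 3 in-betweens, each key-frame pair spans four frame intervals, which is how the frame-count arithmetic above (frames per second times sequence length, divided among the key frames) works out in practice.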

There are several other tasks that may be required, depending on the application. They include motion verification, editing, and production and synchronization of a soundtrack. Many of the functions needed to produce general animations are now computer generated. Please check the example of computer-generated frames for animation sequences.

General Computer Animation Functions

Some steps in the development of an animation sequence are well suited to computer solution. These include object manipulations and rendering, camera motions, and the generation of in-betweens. Animation packages, such as Wavefront for example, provide special functions for designing the animation and processing individual objects.

One function available in animation packages is provided to store and manage the object database. Object shapes and associated parameters are stored and updated in the database. Other object functions include those for motion generation and those for object rendering. Motions can be generated according to specified constraints using two-dimensional or three-dimensional transformations. Standard functions can then be applied to identify visible surfaces and apply the rendering algorithms.

Another typical function simulates camera movements. Standard motions are zooming, panning, and tilting. Finally, given the specifications for the key frames, the in-betweens can be automatically generated.

Tuesday, November 1, 2011

Advantages of Firewall

Computers are constantly under the threat of being attacked by viruses, spyware and hackers. All organizations have critical and confidential information which can be accessed by hackers for misuse; a potential risk for any business. For this reason, it’s very important to incorporate security features when setting up a large or small business IT network in or around the Toronto area.

Using firewalls is one of the best ways to maintain network security, and it’s an essential part of a company’s network protection system. Security software regulates the traffic between your network and the other networks that you access; it prevents unauthorized access to your network while allowing authorized access to continue as normal. With a firewall in place, an unauthorized attempt to access the network can be blocked, or allowed at the user’s discretion. An inner-network activity log is always maintained.
 

You may have the best antivirus software installed on your computer, but only when it’s combined with a firewall is it most effective. Compatible firewall protection may be purchased directly from one of the antivirus companies, or from places offering managed IT services in Toronto or other parts of Ontario. It is very important to have a firewall if you have a computer network. It is advised that any business install it on each and every computer of its network. This way, if one computer within a network gets infected by a virus, it does not infect the other computers within the same network. 

If you’re not sure which antivirus software is best for your computer, you can seek the help of any company that provides computer support in Brampton, Markham, Vaughan or other areas in the GTA.
There are primarily two types of firewalls: software and hardware. Firewall software runs in the background of a computer. It captures each and every network request and decides whether it is authorized or unauthorized. A software firewall is not very expensive and it can be installed very easily once you identify the right version according to the operating system that you use.

Firewall hardware is normally a small box that is fitted between your computer and your modem. It offers better protection and you can use it on more than one computer; it doesn’t matter what type of operating system your computers operate on.

Firewall-How does it work

The Internet is vital for everyday business communication, accessing important data and information exchange, but with the use of this tool comes the threat of people who may attempt to access your network with harmful intentions. Hence, it has become imperative to control the access to the information in your databases. You can do so with the help of a solid firewall; any professional IT consultant in Toronto or outside of the city would agree.


What is a firewall?


Think of a firewall as a traffic policeman who controls and restricts the traffic between your computer and the Internet. It’s a must when setting up large or small business IT networks in Toronto and the Greater Toronto Area. With a firewall installed, all information within the internal network of an organization is protected from the numerous computers connected to the web.

How does it work?

Each computer has a unique IP address that enables it to connect to other computers and allows other users to transfer files to and from other computers in the network. While the Internet has many safety features that control access from different computers -- and most people do adhere to privacy protocols -- the risk of unauthorized access to your computer remains generally high.

Firewalls save a record of IP addresses and help you to control external access to your computer. When a computer tries to access your computer, you receive an alert; then it’s up to you to authorize access. You can program your firewall to authorize access to desired computers or networks, and restrict the access of unwanted computers or networks. This is extremely useful in reducing the threat of viruses, spyware and malware; it even protects your computers from potential hackers.
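The allow/restrict/ask behavior described above can be sketched as a toy rule-based packet filter. The IP addresses below are made-up examples, and a real firewall examines ports, protocols, and connection state rather than just the source address, but the decision logic is the same in spirit:

```python
# Hypothetical rule tables a small office firewall might maintain.
ALLOWED = {"192.168.1.10", "192.168.1.11"}   # trusted hosts on the network
BLOCKED = {"203.0.113.99"}                   # a known-bad external host

def filter_packet(source_ip):
    """Return the action a simple rule-based firewall might take."""
    if source_ip in BLOCKED:
        return "drop"        # restrict access of unwanted computers
    if source_ip in ALLOWED:
        return "accept"      # authorized access continues as normal
    return "ask_user"        # unknown host: alert the user and let them decide

print(filter_packet("192.168.1.10"))   # → accept
print(filter_packet("203.0.113.99"))   # → drop
print(filter_packet("198.51.100.7"))   # → ask_user
```

The "ask_user" branch corresponds to the alert described above: when an unknown computer tries to connect, the decision is deferred to the user, whose answer can then be added to the allowed or blocked table.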

Using a firewall protects the data and information that is sent along your office network, so it is a must for any business and individual to install one for any computer in use. Any unauthorized access to your computer can leave your personal or professional information exposed to hackers who may use your confidential information for selfish or destructive motives.

A good firewall can go a long way in protecting your network from any unauthorized access. If you’re in the area, you can seek advice from companies that provide IT support in Toronto, Mississauga, Oakville or other parts of the GTA to help with firewall installation and any other security needs your network may have.

Tuesday, September 20, 2011

Information Technology (IT) Consulting Firms

Technological advancements have made it very necessary for every firm to have proper control and maintenance of their IT-related activities. If a company does not have its own IT department, it may avail the services of IT consulting firms. These firms provide solutions to IT-related requirements such as Email and Spam Control, Network Security, Data Back-up, and IT Budgeting and Planning.


IT Consulting - Types of Consulting Firms

IT consulting firms can be broadly divided into three types:

Professional Services Firms: These firms are very large in size. They can easily provide IT solutions for a large company. These firms are most trusted all across the world. They have a reputation of providing the best solutions. Their services are somewhat costly.

Staffing Firms: These firms are engaged only for the period when they are handling your business. They employ people for a short duration, and they phase out once their contract with a company ends.

Independent consultants: These are firms that keep up with technological developments. They are competent enough to handle your business needs, and they contribute a lot to the growth of their clients. Independent consultants are considered to be the best choice of all.

Main Concerns handled
IT Consultants mainly handle three concerns:

Data Integrity: A company’s data is one of its most critical assets. It includes critical information such as customer details, orders placed by customers, payment records, etc. Thus, it becomes very important for every organization to protect its data. The services of IT consultants are often hired to ensure data integrity. They help in maintaining back-ups of all the important data.

Security: A company has to protect its network from various threats such as hacking, viruses, Malware, etc. IT Consultants provide superior protection from all these threats by providing Firewalls and Anti-virus programs. It is imperative to let these consultants handle all your security needs. They have expertise in this field and can handle your security requirements in the best possible manner.

System Performance: It is very important for a company’s IT system to perform in the most efficient manner. A company’s business can prosper and grow only if it has proper support. IT consultants provide you the best advice. They help you invest in systems which give you best performance and increased protection. They play a vital role in the success of a company.

Why IT Consultant is important for Us

The reliance of mankind on technology has increased with time. We are dependent on computers for most of our work. Information technology plays a very important role in day to day operations of every business. There are certain IT services which are essential for every business. Firms seek the help of IT consultants to cater to their needs.

The most common services offered by IT consultants are:

Email and Spam Control

Email has become the most extensively used tool for personal as well as professional communication. Businesses greatly rely on email to carry out business communication with their associates and clients. Spam is the biggest problem we face while using email. Each day, hundreds of spam emails are filtered out by spam filters. If these filters stopped functioning any day, it would greatly disrupt our electronic communication activities. You might not be able to reply to important business emails, which may even lead to lost business opportunities. Companies avail the services of IT consultants to control spam filtration tools effectively.

Network Security

Our networks are often vulnerable to attack by viruses or malware. This attack could come from internal or external sources. Thus, network security becomes an inevitable requirement for every business. Some companies handle network security through on-site measures, while others outsource it to IT consultants. Outsourcing is a better measure, as the IT consultants are experts in this domain. Each day, new threats and viruses are generated. These consultants are equipped with the latest developments in network protection. They offer a high degree of protection from all kinds of security threats.

Budgeting and Planning

There are many IT-related investments that a company has to make from time to time, depending on the company’s requirements. A company may not have a separate IT department; hence, it becomes very difficult to get an idea of the budget required for such needs. In the absence of a proper budget, the company may end up spending either too much or too little. This may result in a chaotic situation. A company may seek the help of consulting services for budgeting and planning.

Data Back-Up and Disaster Recovery

It is very important to maintain a back-up of the company’s data. It is always advisable to maintain it off-site through IT consultants. On-site maintenance is not an intelligent move, as your data may be destroyed in case of fire, theft or a natural calamity.

Sunday, May 29, 2011

The Evolution of computers

Necessity is the mother of invention. The saying holds true for computers as well, because computers were invented as a result of man’s search for fast and accurate calculating devices.
The first mechanical adding machine was invented by Blaise Pascal in 1642. Later, in 1671, Baron Gottfried Wilhelm von Leibniz of Germany invented the first calculator for multiplication. Keyboard machines originated in the United States around 1880 and are extensively used even today. Around this period, Herman Hollerith came up with the concept of punched cards, which were extensively used as an input medium in computers even in the late 1970s. Business machines and calculators made their appearance in Europe and America towards the end of the nineteenth century.

Charles Babbage, a nineteenth-century professor at Cambridge University, is considered the father of modern digital computers. During his period, mathematical and statistical tables were prepared by a group of clerks. Even the utmost care and precautions could not eliminate human errors. Babbage had to spend several hours checking these tables. Soon he became dissatisfied and exasperated with this type of monotonous job. The result was that he started thinking about building a machine that could compute tables guaranteed to be error-free. In this process, Babbage designed a “Difference Engine” in the year 1822, which could produce reliable tables. In 1842, Babbage came out with his new idea of the Analytical Engine, which was intended to be completely automatic. It was to be capable of performing the basic arithmetic functions for any mathematical problem, and it was to do so at an average speed of 60 additions per minute. Unfortunately, he was unable to produce a working model of this machine, because the precision engineering required to manufacture it was not available during that period. However, his efforts established a number of principles that have been shown to be fundamental to the design of any digital computer. In order to have a better idea of the evolution of computers, let us now briefly discuss some of the well-known early computers. These are as follows:
1.       The Mark I Computer (1937-44) – Also known as the Automatic Sequence Controlled Calculator, this was the first fully automatic calculating machine, designed by Howard A. Aiken of Harvard University in collaboration with IBM (International Business Machines) Corporation. Its design was based on the techniques already developed for punched-card machinery. It was an electro-mechanical device, since both mechanical and electronic components were used in its design.
Although this machine proved to be extremely reliable, it was very complex in design and huge in size. It used over 3000 electrically actuated switches to control its operations, and was approximately 50 feet long and 9 feet high. It was capable of performing five basic arithmetic operations: addition, subtraction, multiplication, division, and table reference. A number as big as 23 decimal digits could be used in this machine. It took approximately 0.3 seconds to add two numbers and 4.5 seconds to multiply two numbers. Hence, the machine was very slow as compared to today’s computers.
2.       The Atanasoff-Berry Computer (1939-42) – This electronic machine was developed by Dr. John Atanasoff to solve certain mathematical equations. It was called the Atanasoff-Berry Computer, or ABC, after its inventor and his assistant, Clifford Berry. It used 45 vacuum tubes for internal logic and capacitors for storage.
3.       The ENIAC (1943-46) – The Electronic Numerical Integrator and Calculator (ENIAC) was the first all-electronic computer. It was constructed at the Moore School of Engineering of the University of Pennsylvania, U.S.A., by a design team led by Professors J. Presper Eckert and John Mauchly.

ENIAC was developed because of military need, and was used for many years to solve ballistic problems. It took up the wall space of a 20 × 40 square-foot room and used 18,000 vacuum tubes. The addition of two numbers was achieved in 200 microseconds, and multiplication in 2000 microseconds.
4.       The EDVAC (1946-56) – A major drawback of the ENIAC was that its programs were wired on boards, which made it difficult to change them. This problem was later overcome by the “stored program” concept introduced by Dr. John von Neumann. The basic idea behind this concept is that a sequence of instructions, as well as data, can be stored in the memory of the computer for automatically directing the flow of operations. This feature considerably influenced the development of modern digital computers because of the ease with which different programs can be loaded and executed on the same computer. Due to this feature, we often refer to modern digital computers as stored-program digital computers. The Electronic Discrete Variable Automatic Computer (EDVAC) was designed to store both instructions and data in binary form (a system that uses only two digits, 0 and 1, to represent all characters), instead of decimal numbers or human-readable words.
5.       The EDSAC (1947-49) – Almost simultaneously with the EDVAC of the U.S.A., the British developed the Electronic Delay Storage Automatic Calculator (EDSAC). The machine executed its first program in May 1949. In this machine, an addition operation was accomplished in 1500 microseconds, and a multiplication operation in 4000 microseconds. The machine was developed by a group of scientists headed by Professor Maurice Wilkes at the Cambridge University Mathematical Laboratory.
6.       The UNIVAC I (1951) – The Universal Automatic Computer (UNIVAC) was the first digital computer that was not “one of a kind”. Many UNIVAC machines were produced, the first of which was installed in the Census Bureau in 1951 and was used continuously for 10 years. The first business use of a computer, a UNIVAC I, was by the General Electric Corporation in 1954.

In 1952, the International Business Machines (IBM) Corporation introduced the 701 commercial computer. In rapid succession, improved models of the UNIVAC I and other 700-series machines were introduced. In 1953, IBM produced the IBM 650, and sold over 1000 of these computers.

The commercially available digital computers, which could be used for business and scientific applications, had arrived.