
Fluorescent Multi-layer Disc


Requirements for removable media storage devices (RMSDs) used with personal computers have changed significantly since the introduction of the floppy disk in 1971. At one time, desktop computers depended on floppy disks for all of their storage requirements. Even with the advent of multigigabyte hard drives, floppy disks and other RMSDs are still an integral part of most computer systems, providing:

Transport of data files and software between computers
Backup to preserve data from the hard drive
A way to load the operating system software in the event of a hard drive failure

            Data storage devices currently come in a variety of capacities, access times, data transfer rates, and costs per gigabyte. The best overall performance figures are currently achieved using hard disk drives (HDDs), which can be integrated into RAID systems (redundant arrays of inexpensive disks) at costs of about $10 per gigabyte (1999). Optical disc drives (ODDs) and tapes can be configured in the form of jukeboxes and tape libraries, with costs of a few dollars per gigabyte for the removable media. However, the complex mechanical library mechanism limits data access time to several seconds and adversely affects reliability.

            Most information is still stored in non-electronic form, with very slow access and excessive costs (e.g., text on paper, at a cost of roughly $10,000 per gigabyte).

            Some RMSD options available today are approaching the performance, capacity, and cost of hard-disk drives. Considerations for selecting an RMSD include capacity, speed, convenience, durability, data availability, and backward compatibility. Technology options used to read and write data include:

Magnetic formats that use magnetic particles and magnetic fields.

Optical formats that use laser light and optical sensors.

            Magneto-optical formats and hybrids that use a combination of magnetic and optical properties to increase storage capacity.

            The introduction of the Fluorescent Multi-layer Disc (FMD) smashes the barriers of existing data storage formats. Depending on the application and the market requirements, the first generation of 120 mm (CD-sized) FMD ROM discs will hold 20-100 gigabytes of pre-recorded data on 12-30 data layers with a total thickness of under 2 mm. In comparison, a standard DVD disc holds just 4.7 gigabytes. With C3D's (Constellation 3D) proprietary parallel reading and writing technology, data transfer speeds can exceed 1 gigabit per second, again depending on the application and market need.
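To see what those figures imply per layer, the small calculation below simply divides the quoted first-generation capacities by the quoted layer counts; it is an illustration of the arithmetic, not a C3D specification.

```python
# Implied per-layer capacity for the first-generation FMD ROM figures quoted above.
def per_layer_gb(total_gb: float, layers: int) -> float:
    return total_gb / layers

print(f"{per_layer_gb(20, 12):.1f} GB per layer at the low end")   # ~1.7 GB
print(f"{per_layer_gb(100, 30):.1f} GB per layer at the high end") # ~3.3 GB
```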

FireWire

FireWire, originally developed by Apple Computer, Inc., is a cross-platform implementation of the high-speed serial data bus defined by the IEEE 1394-1995 and IEEE 1394a-2000 (FireWire 400) and IEEE 1394b (FireWire 800) standards that moves large amounts of data between computers and peripheral devices. It features simplified cabling, hot swapping, and transfer speeds of up to 800 megabits per second.

FireWire is a high-speed serial input/output (I/O) technology for connecting peripheral devices to a computer or to each other. It is one of the fastest peripheral standards ever developed, and now, at 800 megabits per second (Mbps), it is even faster. Based on Apple-developed technology, FireWire was adopted in 1995 as an official industry standard (IEEE 1394) for cross-platform peripheral connectivity. By providing a high-bandwidth, easy-to-use I/O technology, FireWire inspired a new generation of consumer electronics devices from many companies, including Canon, Epson, HP, Iomega, JVC, LaCie, Maxtor, Mitsubishi, Matsushita (Panasonic), Pioneer, Samsung, Sony, and Texas Instruments. Products such as DV camcorders, portable external disk drives, and MP3 players like the Apple iPod would not be as popular as they are today without FireWire. FireWire has also been a boon to professional users because of the high-speed connectivity it has brought to audio and video production systems. In 2001, the Academy of Television Arts & Sciences presented Apple with an Emmy award in recognition of the contributions made by FireWire to the television industry.

Now FireWire 800, the next generation of FireWire technology, promises to spur the development of more innovative high-performance devices and applications. FireWire 800 (an implementation of the IEEE 1394b standard approved in 2002) doubles the throughput of the original technology, dramatically increases the maximum distance of FireWire connections, and supports many new types of cabling. This technology brief describes the advantages of FireWire 800 and some of the applications for which it is ideally suited.
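To make the "doubles the throughput" claim concrete, the quick calculation below compares the best-case time to move a file at the nominal FireWire 400 and FireWire 800 signalling rates; real transfers are slower because of protocol overhead.

```python
# Best-case transfer time at the nominal bus rates (ignores protocol overhead).
def transfer_seconds(file_gigabytes: float, rate_mbps: float) -> float:
    bits = file_gigabytes * 8 * 1000**3    # decimal gigabytes to bits
    return bits / (rate_mbps * 1000**2)    # megabits per second to bits per second

for rate in (400, 800):
    print(f"1 GB over FireWire {rate}: ~{transfer_seconds(1.0, rate):.0f} s")
# FireWire 400: ~20 s, FireWire 800: ~10 s -- half the time at double the rate
```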

FRAM


A ferroelectric memory cell consists of a ferroelectric capacitor and a MOS transistor. Its construction is similar to the storage cell of a DRAM. The difference is in the dielectric properties of the material between the capacitor's electrodes. This material has a high dielectric constant and can be polarized by an electric field. The polarization remains until it is reversed by an opposite electric field. This makes the memory non-volatile. Note that ferroelectric material, despite its name, does not necessarily contain iron. The best-known ferroelectric substance is BaTiO3.
Data is read by applying an electric field to the capacitor. If this switches the cell into the opposite state (flipping the electric dipoles in the ferroelectric material), more charge is moved than if the cell is not flipped. This can be detected and amplified by sense amplifiers. Reading destroys the contents of a cell, which must therefore be written back after a read. This is similar to the precharge operation in DRAM, though it only needs to be done after a read rather than periodically, as with DRAM refresh.
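The destructive read-then-restore cycle can be sketched in a few lines; the toy model below only illustrates the behaviour just described and is not vendor firmware.

```python
# Toy model of a destructive-read memory cell (FRAM-style read with write-back).
class FerroCell:
    def __init__(self, state: int = 0):
        self.state = state        # polarization direction, 0 or 1

    def read(self) -> int:
        # Sensing forces the cell toward a reference state; if it had to flip,
        # extra charge flows and the sense amplifier reports the stored bit.
        stored = self.state
        self.state = 0            # the read destroys the contents...
        self.state = stored       # ...so the controller immediately writes the value back
        return stored

cell = FerroCell(1)
assert cell.read() == 1 and cell.state == 1   # value preserved across the read
```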
FRAM is found mainly in consumer devices and, because of its low power requirements, could also be used in devices that only need to activate for brief periods. FRAM allows systems to retain information even when power is lost, without resorting to batteries, EEPROM, or flash. Access times are the same as for standard SRAM, so there is no write delay as there is for EEPROM or flash. In addition, the number of write cycles supported by FRAM components is nearly unlimited (up to 10 billion read/write cycles). FRAM thus combines the advantages of SRAM (writing is roughly as fast as reading) and EPROM (non-volatility and in-circuit programmability).

Extreme Ultraviolet Lithography


Silicon has been the heart of the world's technology boom for nearly half a century, but microprocessor manufacturers have all but squeezed the life out of it. The current technology used to make microprocessors will begin to reach its limit around 2005. At that time, chipmakers will have to look to other technologies to cram more transistors onto silicon to create more powerful chips. Many are already looking at extreme-ultraviolet lithography (EUVL) as a way to extend the life of silicon at least until the end of the decade.

Potential successors to optical projection lithography are being aggressively developed. These are known as "Next-Generation Lithographies" (NGLs). EUV lithography (EUVL) is one of the leading NGL technologies; others include x-ray lithography, ion-beam projection lithography, and electron-beam projection lithography. Using extreme-ultraviolet (EUV) light to carve transistors in silicon wafers will lead to microprocessors that are up to 100 times faster than today's most powerful chips, and to memory chips with similar increases in storage capacity.

Extreme Programming


Extreme Programming (XP) is a deliberate and disciplined approach to software development. Though only about six years old, it has already been proven at companies of all sizes and in many industries worldwide. XP is successful because it stresses customer satisfaction. The methodology is designed to deliver the software your customer needs when it is needed. XP empowers software developers to confidently respond to changing customer requirements, even late in the life cycle. This methodology also emphasizes teamwork. Managers, customers, and developers are all part of a team dedicated to delivering quality software. XP implements a simple yet effective way to enable groupware-style development.
XP improves a software project in four essential ways: communication, simplicity, feedback, and courage. XP programmers communicate with their customers and fellow programmers. They keep their design simple and clean. They get feedback by testing their software starting on day one. They deliver the system to the customers as early as possible and implement changes as suggested. With this foundation, XP programmers are able to courageously respond to changing requirements and technology. XP is different. It is a lot like a jigsaw puzzle. There are many small pieces. Individually the pieces make no sense, but when combined together a complete picture can be seen. This is a significant departure from traditional software development methods and ushers in a change in the way we program.
If one or two developers have become bottlenecks because they own the core classes in the system and must make all the changes, try collective code ownership. You will also need unit tests. Let everyone make changes to the core classes whenever they need to. You could continue this way until no problems are left, then add the remaining practices as you can. The first practice you add will seem easy; you are solving a large problem with a little extra effort. The second might seem easy too. But at some point between having a few XP rules and having all of them, it will take some persistence to make the approach work, even once your immediate problems have been solved and your project is under control. It might seem tempting to abandon the new methodology and go back to what is familiar and comfortable, but continuing does pay off in the end. Your development team will become much more efficient than you thought possible. At some point you will find that the XP rules no longer seem like rules at all. There is a synergy between the rules that is hard to understand until you have been fully immersed. This uphill climb is especially true with pair programming, but the payoff of this technique is very large. Also, unit tests will take time to collect, but unit tests are the foundation for many of the other XP practices, so the payoff is very great.
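To give a flavour of the unit-test practice that underpins collective code ownership, the fragment below uses Python's standard unittest module; the Account class is a made-up example, not part of any XP toolkit.

```python
import unittest

class Account:
    """Hypothetical domain class that any team member may change."""
    def __init__(self, balance: float = 0.0):
        self.balance = balance

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

class AccountTest(unittest.TestCase):
    # Tests like these let anyone modify Account and know at once if they broke it.
    def test_deposit_increases_balance(self):
        acct = Account()
        acct.deposit(10.0)
        self.assertEqual(acct.balance, 10.0)

    def test_rejects_non_positive_deposit(self):
        with self.assertRaises(ValueError):
            Account().deposit(0)

if __name__ == "__main__":
    unittest.main()
```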

E-mail Alert System


Today we are witnessing fast changes in telecommunications. The computer and the telephone are two technologies that have made a significant revolution in communications, but for technological reasons they were developed separately. The fast development of communication and computer technology has led to the merging of the public switched telephone network (PSTN) and the Internet into a global information network of integrated services. Internet services are becoming a more important way of information exchange and communication, turning fixed and mobile telephony toward Internet services.
One of the deficiencies of Internet services compared with fixed and mobile telephony is the availability of service: Internet services are available only when connected. The results of our research carried out before the development of the E-mail Alert (EMA) System show that Internet users receive on average five to six e-mails every day, and 82 percent of these users check their mailbox first in the course of their Internet connection. Thus there is a clear demand for the development of e-mail alerting systems. The EMA system is a computer telephony integration (CTI) application that integrates the advantages of telephony and the Internet by connecting e-mail and phone services. The EMA system informs users of the arrival of new e-mail messages, which is convenient for organizations that do not allow their e-mail servers to be accessed from outside. On the other side are Internet service providers with a large number of users. To satisfy both groups of requirements, two versions of the EMA system are proposed. The enterprise version is developed to allow e-mail server access inside intranet environments, while the public version is designed for public service providers. The EMA system is implemented on the Win32 platform using the C and C++ programming languages; HTML, ASP, JavaScript, and VBScript are used for the Web interface to overcome differences between Web browsers.
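The core idea of such an alerting service, detect new mail and trigger a notification toward the phone network, can be sketched as a simple mailbox poller. The real EMA system is a Win32 C/C++ application; the host, credentials, and notify() below are placeholders for illustration only.

```python
# Minimal sketch of an e-mail alert loop: poll a mailbox, notify on new messages.
import imaplib
import time

def notify(count: int) -> None:
    # Stand-in for the telephony side (e.g. placing a call or sending an SMS).
    print(f"You have {count} new message(s)")

def poll_mailbox(host: str, user: str, password: str, interval_s: int = 60) -> None:
    while True:
        with imaplib.IMAP4_SSL(host) as imap:
            imap.login(user, password)
            imap.select("INBOX", readonly=True)
            status, data = imap.search(None, "UNSEEN")
            unseen = data[0].split() if status == "OK" else []
            if unseen:
                notify(len(unseen))
        time.sleep(interval_s)

# poll_mailbox("imap.example.com", "user", "secret")   # example values only
```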

E-Intelligence


Organizations have, over the years, successfully employed business intelligence tools like OLAP and data warehousing to improve the supply of business information to end users for cross-industry applications like finance and customer relationship management, and in vertical markets such as retail, manufacturing, healthcare, banking, financial services, telecommunications, and utilities. In recent years, the Internet has opened up an entirely new channel for marketing and selling products. Companies are taking to e-business in a big way. The issue facing end users as organizations deploy e-business systems is that they have not had the same business intelligence capabilities available in e-business systems as they do in the traditional corporate operating environment. This prevents businesses from exploiting the full power of the Internet as a sales and marketing channel.
As a solution, vendors are now developing business intelligence applications to capture and analyze the information flowing through e-business systems, and are developing Web-based information portals that provide an integrated and personalized view of enterprise-wide business information, applications, and services. These advanced business intelligence systems are called e-intelligence systems.

E-bomb


The next Pearl Harbor will not announce itself with a searing flash of nuclear light or with the plaintive wails of those dying of Ebola or its genetically engineered twin. You will hear a sharp crack in the distance. By the time you mistakenly identify this sound as an innocent clap of thunder, the civilized world will have become unhinged. Fluorescent lights and television sets will glow eerily bright, despite being turned off. The aroma of ozone mixed with smoldering plastic will seep from outlet covers as electric wires arc and telephone lines melt. Your Palm Pilot and MP3 player will feel warm to the touch, their batteries overloaded. Your computer, and every bit of data on it, will be toast. And then you will notice that the world sounds different too. The background music of civilization, the whirl of internal-combustion engines, will have stopped. Save a few diesels, engines will never start again. You, however, will remain unharmed, as you find yourself thrust backward 200 years, to a time when electricity meant a lightning bolt fracturing the night sky. This is not a hypothetical, son-of-Y2K scenario. It is a realistic assessment of the damage that could be inflicted by a new generation of weapons--E-bombs.
Anyone who's been through a prolonged power outage knows that it's an extremely trying experience. Within an hour of losing electricity, you develop a healthy appreciation of all the electrical devices you rely on in life. A couple hours later, you start pacing around your house. After a few days without lights, electric heat or TV, your stress level shoots through the roof. But in the grand scheme of things, that's nothing. If an outage hits an entire city, and there aren't adequate emergency resources, people may die from exposure, companies may suffer huge productivity losses and millions of dollars of food may spoil. If a power outage hit on a much larger scale, it could shut down the electronic networks that keep governments and militaries running. We are utterly dependent on power, and when it's gone, things get very bad, very fast.
An electromagnetic bomb, or e-bomb, is a weapon designed to take advantage of this dependency. But instead of simply cutting off power in an area, an e-bomb would actually destroy most machines that use electricity. Generators would be useless, cars wouldn't run, and there would be no chance of making a phone call. In a matter of seconds, a big enough e-bomb could thrust an entire city back 200 years or cripple a military unit.

Digital watermarking


In recent years, the distribution of works of art, including pictures, music, video and textual documents, has become easier. With the widespread and increasing use of the Internet, digital forms of these media (still images, audio, video, text) are easily accessible. This is clearly advantageous, in that it is easier to market and sell one's works of art. However, this same property threatens copyright protection. Digital documents are easy to copy and distribute, allowing for pirating. There are a number of methods for protecting ownership. One of these is known as digital watermarking.
Digital watermarking is the process of inserting a digital signal or pattern (indicative of the owner of the content) into digital content. The signal, known as a watermark, can be used later to identify the owner of the work, to authenticate the content, and to trace illegal copies of the work.
Watermarks of varying degrees of obtrusiveness are added to presentation media as a guarantee of authenticity, quality, ownership, and source.
To be effective in its purpose, a watermark should adhere to a few requirements. In particular, it should be robust and transparent. Robustness requires that it be able to survive any alterations or distortions that the watermarked content may undergo, including intentional attacks to remove the watermark and common signal processing alterations used to make the data more efficient to store and transmit, so that the owner can still be identified afterwards. Transparency requires a watermark to be imperceptible, so that it does not affect the quality of the content and makes detection, and therefore removal, by pirates less feasible.
The media of focus in this paper is the still image. There are a variety of image watermarking techniques, falling into two main categories depending on the domain in which the watermark is constructed: the spatial domain (producing spatial watermarks) and the frequency domain (producing spectral watermarks). The effectiveness of a watermark is improved when the technique exploits known properties of the human visual system. These are known as perceptually based watermarking techniques. Within this category, the class of image-adaptive watermarks proves most effective.
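As a concrete (and deliberately simple) example of a spatial-domain technique, the sketch below hides watermark bits in the least significant bit of each pixel of a grayscale image; real perceptually based schemes are far more sophisticated and far more robust than this toy.

```python
import numpy as np

def embed_lsb(image: np.ndarray, watermark_bits: np.ndarray) -> np.ndarray:
    """Spatial-domain watermark: overwrite the least significant bit of each pixel."""
    flat = image.flatten().copy()
    bits = np.resize(watermark_bits, flat.shape)       # tile the mark over the image
    return ((flat & 0xFE) | bits).reshape(image.shape)

def extract_lsb(image: np.ndarray, length: int) -> np.ndarray:
    return (image.flatten() & 1)[:length]

img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)   # toy 4x4 grayscale image
mark = np.array([1, 0, 1, 1], dtype=np.uint8)
marked = embed_lsb(img, mark)
assert np.array_equal(extract_lsb(marked, 4), mark)        # mark recovered in this toy case
```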
In conclusion, image watermarking techniques that take advantage of properties of the human visual system, and the characteristics of the image create the most robust and transparent watermarks.

Digital theatre


Digital Theatre System (digital cinema, or d-cinema) is perhaps the most significant challenge to the cinema industry since the introduction of sound on film. As with any new technology, there are those who want to do it fast, and those who want to do it right. Both points of view are useful. This new technology will completely replace the conventional theatre system, with its film projectors, film reels, lower-quality pictures, and analog sound systems.
Let's not forget the lesson learned with the introduction of digital audio for film in the '90s. Cinema Digital Sound, a division of Optical Radiation Corporation, was the first to put digital audio on 35mm film. Very, very few remember CDS, who closed their doors long ago. Such are the rewards for being first.  

DiffServ


Today's Internet provides a best effort service. It processes traffic as quickly as possible, but there is no guarantee at all about timeliness or actual delivery: it just tries its best. However, the Internet is rapidly growing into a commercial infrastructure, and economies are getting more and more dependent on a high service level with regard to the Internet. Massive (research) efforts are put into transforming the Internet from a best effort service into a network service users can really rely upon.
Commercial demands gave rise to the idea of having various classes of service. For instance, one can imagine that companies might offer (or buy, for that matter) either a gold, silver, or bronze service level, each having its own characteristics in terms of bandwidth and latency with regard to network traffic. This is called Quality of Service (QoS). The Internet Engineering Task Force (IETF), one of the main driving forces behind Internet-related technologies, has proposed several architectures to meet this demand for QoS. Integrated Services and Differentiated Services, developed in the "intserv" and "diffserv" IETF Working Groups, are probably the best-known models and mechanisms. The IETF diffserv WG has also defined a DiffServ Management Information Base (MIB), a virtual storage place for management information regarding DiffServ. At the time of writing, this MIB is still work in progress. This assignment contributes to the development of the DiffServ MIB by writing a prototype implementation of a DiffServ MIB agent and giving feedback to the IETF community. One of the likely uses of the DiffServ MIB is that it may act as part of a bigger policy-based management framework, so an implementation of the DiffServ MIB might also help development in that area.
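Differentiated Services works by marking each packet's DS field (the DSCP, carried in what was the IP TOS byte) so that routers can map it to a per-hop behaviour. The snippet below shows one way an application on a Unix-like host might request Expedited Forwarding marking on its traffic; whether that marking is honoured is entirely up to the network's DiffServ policy, and the address and port are examples only.

```python
import socket

EF_DSCP = 46                  # Expedited Forwarding per-hop behaviour (RFC 3246)
TOS_VALUE = EF_DSCP << 2      # DSCP occupies the upper six bits of the old TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Ask the IP layer to mark outgoing packets with the EF codepoint.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)
sock.sendto(b"latency-sensitive payload", ("192.0.2.10", 5004))   # example address/port
```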

DNA computer


Computer chip manufacturers are furiously racing to make the next microprocessor that will topple speed records. Sooner or later, though, this competition is bound to hit a wall. Microprocessors made of silicon will eventually reach their limits of speed and miniaturization. Chip makers need a new material to produce faster computing speeds.
 Millions of natural supercomputers exist inside living organisms, including your body. DNA (deoxyribonucleic acid) molecules, the material our genes are made of, have the potential to perform calculations many times faster than the world's most powerful human-built computers. DNA might one day be integrated into a computer chip to create a so-called biochip that will push computers even faster. DNA molecules have already been harnessed to perform complex mathematical problems.
While still in their infancy, DNA computers will be capable of storing billions of times more data than your personal computer. DNA has already been used to calculate complex mathematical problems. However, these early DNA computers are far from challenging silicon-based computers in terms of speed. The Rochester team's DNA logic gates are the first step toward creating a computer that has a structure similar to that of an electronic PC. Instead of using electrical signals to perform logical operations, these DNA logic gates rely on DNA code. They detect fragments of genetic material as input, splice together these fragments, and form a single output.
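That gate behaviour can be mimicked in software as a toy model: treat short DNA fragments as inputs and "splice" them into an output strand only when both expected fragments are detected. The sequences and the AND-gate choice below are invented for illustration and do not correspond to the Rochester group's actual chemistry.

```python
# Toy software model of a DNA AND gate: the gate produces an output strand only
# when both expected input fragments are present in the sample.
# The sequences here are invented placeholders, not real oligonucleotides.
INPUT_A = "ATCG"
INPUT_B = "GGTA"

def dna_and_gate(sample):
    if INPUT_A in sample and INPUT_B in sample:
        return INPUT_A + INPUT_B       # "splice" the detected fragments into one output
    return None                        # no output strand: logical 0

print(dna_and_gate({"ATCG", "GGTA"}))  # ATCGGGTA -> logical 1
print(dna_and_gate({"ATCG"}))          # None     -> logical 0
```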

Crusoe


Mobile computing has been a buzzword for quite a long time. Mobile computing devices like laptops, web slates, and notebook PCs are becoming common nowadays. The heart of every PC, whether a desktop or a mobile PC, is the microprocessor. Several microprocessors are available in the market for desktop PCs from companies like Intel, AMD, and Cyrix. The mobile computing market, however, has never had a microprocessor specifically designed for it; the microprocessors used in mobile PCs are optimized versions of desktop PC microprocessors. Mobile computing makes very different demands on processors than desktop computing, yet up until now, mobile x86 platforms have simply made do with the same old processors originally designed for desktops. Those processors consume lots of power, and they get very hot. When you're on the go, a power-hungry processor means you have to pay a price: run out of power before you've finished, run more slowly and lose application performance, or run through the airport with pounds of extra batteries. A hot processor also needs fans to cool it, making the resulting mobile computer bigger, clunkier, and noisier. A newly designed microprocessor with low power consumption will still be rejected by the market if its performance is poor, so any attempt in this regard must strike a proper performance-power balance to ensure commercial success. A new microprocessor must also be fully x86-compatible, that is, it should run x86 applications just like conventional x86 microprocessors, since most presently available software has been designed to work on the x86 platform.
Crusoe is a new microprocessor designed specifically for the mobile computing market, with the above constraints in mind. It was developed by a small Silicon Valley startup company called Transmeta Corp. after five years of secret toil at an expenditure of $100 million. The concept of Crusoe is best understood from a simple sketch of the processor architecture, called the 'amoeba'. In this concept, the x86 architecture is an ill-defined amoeba containing features like segmentation, ASCII arithmetic, and variable-length instructions. The amoeba illustrated how a traditional microprocessor was, in Transmeta's design, to be divided up into hardware and software.
Thus Crusoe was conceptualized as a hybrid microprocessor; that is, it has a software part and a hardware part, with the software layer surrounding the hardware unit. The role of the software is to act as an emulator, translating x86 binaries into native code at run time. Crusoe is a 128-bit microprocessor fabricated using a CMOS process. The chip's design is based on a technique called VLIW (very long instruction word) to ensure design simplicity and high performance. Besides this, it also uses Transmeta's two patented technologies, namely Code Morphing software and LongRun power management. It is a highly integrated processor available in different versions for different market segments.
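The essence of Code Morphing is dynamic binary translation with a translation cache: each block of x86 code is translated to native VLIW code once, cached, and reused on subsequent executions. The sketch below is a heavily simplified illustration of that caching idea; the "x86" and "native" instructions are stand-ins, not real encodings.

```python
# Highly simplified model of dynamic binary translation with a translation cache,
# the idea behind Transmeta's Code Morphing software. Instructions are stand-ins.
translation_cache = {}

def translate_block(x86_block: tuple) -> list:
    # Pretend translation: wrap each "x86 op" as a "native op" (real hardware would
    # schedule several operations into one wide VLIW instruction, or "molecule").
    print(f"translating {x86_block}")
    return [("native", op) for op in x86_block]

def execute(x86_block: tuple) -> None:
    native = translation_cache.get(x86_block)
    if native is None:                       # translate only on first encounter
        native = translate_block(x86_block)
        translation_cache[x86_block] = native
    for op in native:
        pass                                 # stand-in for running the native code

hot_loop = ("mov", "add", "jnz")
execute(hot_loop)   # translated once...
execute(hot_loop)   # ...then reused from the cache with no translation cost
```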

corDECT Wireless in Local Loop System


corDECT is an advanced, field proven, Wireless Access System developed by Midas Communication Technologies and the Indian Institute of Technology, Madras, in association with Analog Devices Inc., USA.
corDECT provides a complete wireless access solution for new and expanding telecommunication networks with seamless integration of both voice and Internet services. It is the only cost-effective Wireless Local Loop (WLL) system in the world today that provides simultaneous toll-quality voice and 35 or 70 kbps Internet access to wireless subscribers.
corDECT is based on the DECT standard specification from the European Telecommunication Standards Institute (ETSI). In addition, it incorporates new concepts and innovative designs brought about by the collaboration of a leading R & D company, a renowned university, and a global semiconductor manufacturer. This alliance has resulted in many breakthrough concepts including that of an Access Network that segregates voice and Internet traffic and delivers each, in the most efficient manner, to the telephone network and the Internet respectively, without the one choking the other.

Computer Clothing


A wearable computer comprises a computer built within ordinary clothing. This transformation allows it to be worn constantly, with the goal of becoming a seamless extension of body and mind. Equipped with various sensors which measure heart rate, respiration, footstep rate, etc., the apparatus can function as a personal safety device for reducing crime, as well as a personal health monitor for improving health care by encouraging individuals to take an active role in diagnosis and body maintenance. The wearable computer apparatus is embedded within non-transparent clothing, which provides shielding. Electronic circuits are built entirely out of textiles to distribute data and power and perform touch sensing. These circuits combine passive components sewn from conductive yarns with conventional components to create interactive electronic devices, such as musical keyboards and graphic input surfaces.

Blu-ray Disc


Tokyo, Japan, February 19, 2002: Nine leading companies today announced that they have jointly established the basic specifications for a next-generation, large-capacity optical disc video recording format called "Blu-ray Disc". The Blu-ray Disc enables the recording, rewriting, and playback of up to 27 gigabytes (GB) of data on a single-sided, single-layer 12 cm CD/DVD-size disc using a 405 nm blue-violet laser.
By employing a short-wavelength blue-violet laser, the Blu-ray Disc successfully minimizes its beam spot size by raising the numerical aperture (NA) of the field lens that converges the laser to 0.85. In addition, by using a disc structure with a 0.1 mm optical transmittance protection layer, the Blu-ray Disc diminishes aberration caused by disc tilt. This also allows for better disc readout and an increased recording density. The Blu-ray Disc's track pitch is reduced to 0.32 um, almost half that of a regular DVD, achieving up to 27 GB of high-density recording on a single-sided disc.
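The gain from the shorter wavelength and larger NA follows from the diffraction-limited spot size, which scales roughly as wavelength divided by NA, with areal density scaling roughly as the inverse square of the spot size. The quick comparison below uses standard DVD parameters (650 nm, NA 0.60) against the Blu-ray figures quoted above.

```python
# Spot size ~ wavelength / NA; areal density ~ 1 / spot_size**2 (rough scaling only).
def relative_spot(wavelength_nm: float, na: float) -> float:
    return wavelength_nm / na

dvd = relative_spot(650, 0.60)
bd  = relative_spot(405, 0.85)
print(f"spot size ratio DVD/BD: {dvd / bd:.2f}")         # ~2.3x smaller spot
print(f"approx density gain:    {(dvd / bd) ** 2:.1f}x") # ~5x more bits per unit area
```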
Because the Blu-ray Disc utilizes global standard "MPEG-2 Transport Stream" compression technology highly compatible with digital broadcasting for video recording, a wide range of content can be recorded. It is possible for the Blu-ray Disc to record digital high definition broadcasting while maintaining high quality and other data simultaneously with video data if they are received together. In addition, the adoption of a unique ID written on a Blu-ray Disc realizes high quality copyright protection functions.
The Blu-ray Disc is a technology platform that can store sound and video while maintaining high quality and also access the stored content in an easy-to-use way. This will be important in the coming broadband era as content distribution becomes increasingly diversified. The nine companies involved in the announcement will respectively develop products that take full advantage of Blu-ray Disc's large capacity and high-speed data transfer rate. They are also aiming to further enhance the appeal of the new format through developing a larger capacity, such as over 30GB on a single sided single layer disc and over 50GB on a single sided double layer disc. Adoption of the Blu-ray Disc in a variety of applications including PC data storage and high definition video software is being considered.

Cellular Digital Packet Data


Cellular Digital Packet Data (CDPD) systems offer what is currently one of the most advanced means of wireless data transmission technology. Generally used as a tool for business, CDPD holds promises for improving law enforcement communications and operations. As technologies improve, CDPD may represent a major step toward making our nation a wireless information society. While CDPD technology is more complex than most of us care to understand, its potential benefits are obvious even to technological novices.
In this so-called age of information, no one needs to be reminded of the importance of speed and accuracy in the storage, retrieval, and transmission of data. The CDPD network is little more than a year old and is already proving to be a hot digital enhancement to the existing phone network. CDPD transmits digital packet data at 19.2 Kbps, using idle times between cellular voice calls on the cellular telephone network.
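To put 19.2 Kbps in perspective, the short calculation below estimates the best-case airtime for a dispatch-sized message; the 2 KB message size is an assumption for illustration, and protocol overhead and contention for idle slots are ignored.

```python
# Best-case airtime for a small message at the CDPD raw channel rate.
RATE_BPS = 19_200

def airtime_seconds(message_bytes: int) -> float:
    return message_bytes * 8 / RATE_BPS

print(f"{airtime_seconds(2048):.2f} s for a 2 KB record")   # ~0.85 s
```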
CDPD technology represents a way for law enforcement agencies to improve how they manage their communications and information systems. For over a decade, agencies around the world have been experimenting with placing mobile data terminals (MDTs) in their vehicles to enhance officer safety and efficiency.
Early MDTs transmitted their information using radio modems. In that case, data could be lost in transmission during bad weather or when mobile units were not properly located in relation to transmission towers. More recently, MDTs have transmitted data using analog cellular telephone modems. This shift represented an improvement in mobile data communications, but the systems still had flaws that limited their utility.
Since the mid-1990s, computer manufacturers and the telecommunications industry have been experimenting with the use of digital cellular telecommunications as a wireless means to transmit data. The result of these efforts is the CDPD system. These systems allow users to transmit data with a higher degree of accuracy, fewer service interruptions, and strong security, giving mobile users almost instantaneous access to information.

Biometric Technology


BIOMETRICS refers to the automatic identification of a person based on his or her physiological or behavioral characteristics. This method of identification is preferred for various reasons: the person to be identified is required to be physically present at the point of identification, and identification based on biometric techniques obviates the need to remember a password or carry a token. With the increased use of computers as vehicles of information technology, it is necessary to restrict access to sensitive or personal data. By replacing PINs, biometric techniques can potentially prevent unauthorized access to, or fraudulent use of, ATMs, cellular phones, smart cards, desktop PCs, workstations, and computer networks. PINs and passwords may be forgotten, and token-based methods of identification like passports and driver's licenses may be forged, stolen, or lost. Thus biometric systems of identification are enjoying a renewed interest. Various types of biometric systems are being used for real-time identification; the most popular are based on face recognition and fingerprint matching. However, there are other biometric systems that utilize iris and retinal scans, speech, facial thermograms, and hand geometry.
A biometric system is essentially a pattern recognition system, which makes a personal identification by determining the authenticity of a specific physiological or behavioral characteristic possessed by the user. An important issue in designing a practical system is to determine how an individual is identified. Depending on the context, a biometric system can be either a verification (authentication) system or an identification system. There are two different ways to resolve a person's identity: verification and identification. Verification ("Am I who I claim to be?") involves confirming or denying a person's claimed identity. In identification, one has to establish a person's identity ("Who am I?"). Each of these approaches has its own complexities and could probably be solved best by a certain biometric system.
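In matcher terms, verification is a one-to-one comparison of a live sample against a single claimed template, while identification is a one-to-many search over the whole enrolled database. The sketch below expresses that distinction with a generic similarity score; the match_score function and the threshold are illustrative assumptions, not a specific biometric algorithm.

```python
# Verification = 1:1 comparison against a claimed identity.
# Identification = 1:N search over all enrolled templates.
# match_score() stands in for a real matcher (fingerprint, face, iris, ...).
def match_score(sample, template) -> float:
    # Placeholder similarity in [0, 1]; a real system compares feature vectors.
    return 1.0 if sample == template else 0.0

def verify(sample, claimed_id, database, threshold=0.8) -> bool:
    return match_score(sample, database[claimed_id]) >= threshold

def identify(sample, database, threshold=0.8):
    best_id = max(database, key=lambda uid: match_score(sample, database[uid]))
    return best_id if match_score(sample, database[best_id]) >= threshold else None

enrolled = {"alice": "template-A", "bob": "template-B"}   # toy templates
print(verify("template-A", "alice", enrolled))            # True  (claim confirmed)
print(identify("template-B", enrolled))                   # 'bob' (database searched)
```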
Biometrics is a rapidly evolving technology which is being used in forensics, such as criminal identification and prison security, and has the potential to be used in a large range of civilian application areas. Biometrics can be used to secure transactions conducted via telephone and the Internet (electronic commerce and electronic banking). In automobiles, biometrics can replace keys with key-less entry devices.

Virtual keyboard


At their most basic, all keyboards, whether they're physical or virtual, are input devices -- once you type in a certain series of keystrokes, you're telling the keyboard to deliver a command to your computer. This allows you to write in a word-processing document, close out a program or write out a Web site's URL in a browser. But apart from the science-fiction element, what sets a virtual laser keyboard apart from a regular keyboard?

A traditional keyboard, one that hooks up to a desktop computer or is part of a laptop, is very much like another smaller computer. If you take it apart, it has a processor and circuitry similar to the insides of your computer. Underneath each key is a grid of circuits, and once you press a key, the switch closes. This sends a small electrical current through the grid, which the processor recognizes and analyzes. The processor, in turn, sends the information regarding your keystrokes to your computer, and it can do this several ways. Most desktop users connect their keyboard using cables, although common wireless technologies like Bluetooth let you type from a distance, as long as the computer has the necessary receiver. Laptop keyboards, on the other hand, connect directly to the computer's hardware.

When you type on a virtual laser keyboard, there aren't any switches involved. In fact, there aren't any mechanical moving parts at all. The device projects the image of a QWERTY keyboard onto a flat, non-reflective surface using a red diode laser. The laser, similar to the kind you see on those cheap laser pointers people wave at rock concerts, shines through a Diffractive Optical Element (DOE), which is simply a tiny image of the keyboard. The DOE, along with special optical lenses, expands the image to a usable size and projects it onto a surface.

But a simple image of a keyboard won't get you anywhere -- something needs to analyze the information you type in. Situated near the bottom of the device is an infrared (IR) laser diode, which shoots out a thin plane of infrared light. The plane, which is invisible and runs parallel to the surface, rests only millimeters above the image of the keyboard. When you start typing, you pass your fingers through certain areas of the infrared light. A CMOS (complementary metal-oxide semiconductor) sensor images your finger's position within the area of the keyboard, and a special sensor chip called a Virtual Interface Processing Core analyzes the location of the intended keystroke. The device then sends this information to the computer receiving the commands.
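The final step of that pipeline, turning a sensed finger position into a keystroke, amounts to looking up which projected key cell the (x, y) coordinate falls in. The grid layout and dimensions below are invented to illustrate the lookup; a real device calibrates them against the projected image.

```python
# Map a sensed finger position (in millimetres on the projection surface) to a key.
# The 3-row layout and cell size below are illustrative, not a real device's geometry.
ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
KEY_W, KEY_H = 17.0, 17.0             # assumed key cell size in mm

def keystroke(x_mm: float, y_mm: float):
    row = int(y_mm // KEY_H)
    col = int(x_mm // KEY_W)
    if 0 <= row < len(ROWS) and 0 <= col < len(ROWS[row]):
        return ROWS[row][col]
    return None                       # finger outside the projected keyboard

print(keystroke(5.0, 5.0))    # 'Q' (top-left cell)
print(keystroke(40.0, 20.0))  # 'D' (third column, second row)
```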

Blue eyes


The U.S. computer giant IBM has been conducting research on Blue Eyes technology at its Almaden Research Center (ARC) in San Jose, Calif., since 1997. The ARC is IBM's main laboratory for basic research. The primary objective of the research is to give a computer the ability of a human being to assess a situation by using the senses of sight, hearing, and touch.
Animal survival depends on highly developed sensory abilities. Likewise, human cognition depends on highly developed abilities to perceive, integrate, and interpret visual, auditory, and touch information. Without a doubt, computers would be much more powerful if they had even a small fraction of the perceptual ability of animals or humans. Adding such perceptual abilities to computers would enable computers and humans to work together more as partners. Toward this end, the Blue Eyes project aims at creating computational devices with the sort of perceptual abilities that people take for granted.
Thus Blue Eyes is the technology that makes computers sense and understand human behavior and feelings and react in appropriate ways.