
SEMINAR ON FRACTAL IMAGE COMPRESSION

INTRODUCTION
The subject of this work is image compression with fractals. Today JPEG has become an industrial standard in image compression. Further research is being conducted in two areas: wavelet-based compression and fractal image compression. The fractal scheme was introduced by Michael F. Barnsley in the late 1980s. His idea was that images could be compactly stored as iterated functions, which led to the development of the iterated function system (IFS) scheme that forms the basis of fractal image compression. Further work in this area was conducted by A. Jacquin, a student of Barnsley, who published several papers on this subject. He was the first to publish an efficient algorithm based on local (partitioned) iterated function systems.
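The partitioned IFS idea can be illustrated with a minimal sketch (Python with NumPy, used purely for illustration; the 4x4/8x8 block sizes, the exhaustive search and the least-squares fit are assumptions of this sketch, not a description of Jacquin's published algorithm). Each small "range" block of the image is approximated by a contracted, brightness-adjusted copy of some larger "domain" block, and only the domain index and the affine coefficients are stored:

```python
import numpy as np

def downsample(block):
    """Average 2x2 pixels so an 8x8 domain block matches a 4x4 range block."""
    return block.reshape(4, 2, 4, 2).mean(axis=(1, 3))

def fit_affine(domain, rng):
    """Least-squares contrast s and brightness o so that s*domain + o ~ rng."""
    d, r = domain.ravel(), rng.ravel()
    s, o = np.polyfit(d, r, 1) if d.std() > 1e-9 else (0.0, r.mean())
    err = np.sum((s * d + o - r) ** 2)
    return s, o, err

def encode(img, r=4, d=8):
    """For every 4x4 range block, store the best-matching 8x8 domain block
    together with its affine (s, o) coefficients: a partitioned IFS code.
    Image dimensions are assumed to be multiples of 8."""
    h, w = img.shape
    domains = [(y, x, downsample(img[y:y + d, x:x + d]))
               for y in range(0, h - d + 1, d) for x in range(0, w - d + 1, d)]
    code = []
    for y in range(0, h, r):
        for x in range(0, w, r):
            rng = img[y:y + r, x:x + r]
            best = min((fit_affine(dom, rng) + (dy, dx)
                        for dy, dx, dom in domains), key=lambda t: t[2])
            s, o, _, dy, dx = best
            code.append((y, x, dy, dx, s, o))   # the stored "fractal" code
    return code
```

Decoding simply iterates these stored maps from any starting image; because the maps are contractive, the iteration converges to an approximation of the original picture.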

Energy transmission system for an artificial heart - leakage inductance compensation


The artificial heart now in use, like the natural heart it is designed to replace, is a four-chambered device for pumping blood. Electrical circulatory assist devices such as the total artificial heart or ventricular assist devices generally use a brushless dc motor as their pump. They require 12–35 W to operate and can be powered by a portable battery pack and a dc-dc converter.
It would be desirable to transfer electrical energy to these circulatory assist devices transcutaneously, without breaking the skin. This technique would need a power supply that uses a transcutaneous transformer to drive the motor of the circulatory assist device. The secondary of this transformer would be implanted under the skin, and the primary would be placed on top of the secondary, external to the body. The distance between the transformer windings would be approximately equal to the thickness of the patient's skin, nominally between 1 and 2 cm. This spacing cannot be assumed constant; the alignment of the cores and the distance between them would certainly vary during operation.
A transformer with a large (1–2 cm) air gap between the primary and the secondary has large leakage inductances. In this application, the coupling coefficient k ranges approximately from 0.1 to 0.4. This makes the leakage inductances of the same order of magnitude as, and usually larger than, the magnetizing inductance. Therefore, the voltage transfer gain is very low, and a significant portion of the primary current flows through the magnetizing inductance. The large circulating current through the magnetizing inductance results in poor efficiency.
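A back-of-the-envelope calculation (a Python sketch; the 1:1 turns ratio and the 100 uH self-inductance are illustrative assumptions, not values from the paper) shows why such loose coupling is so troublesome: in the usual T-model the leakage inductances exceed the magnetizing inductance once k drops below about 0.5, and the open-circuit voltage gain falls to roughly k.

```python
import math

L1 = L2 = 100e-6   # assumed 1:1 transformer, 100 uH self-inductance per winding

for k in (0.1, 0.2, 0.3, 0.4):
    M = k * math.sqrt(L1 * L2)        # mutual inductance
    L_leak1 = L1 - M                  # primary leakage (T-model, 1:1)
    L_leak2 = L2 - M                  # secondary leakage
    L_mag = M                         # magnetizing inductance
    gain_oc = M / L1                  # open-circuit voltage gain, ~k for a 1:1 transformer
    print(f"k={k:.1f}: leakage={L_leak1*1e6:.0f} uH per side, "
          f"magnetizing={L_mag*1e6:.0f} uH, open-circuit gain={gain_oc:.2f}")
```

At k = 0.2, for example, each leakage inductance is four times the magnetizing inductance, which is exactly the imbalance that series compensation on both sides of the transformer is intended to cancel at the operating frequency.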
A dc–dc converter employing secondary-side resonance has been reported to alleviate these problems by lowering the impedance of the secondary side using a resonant circuit. Although the circulating current is lowered, the transfer gain of the voltage varies widely as the coupling coefficient varies, so the advantageous characteristics are lost as the coupling coefficient deviates from its design value.
In this paper, compensation of the leakage inductances on both sides of the transcutaneous transformer is presented. This converter offers significant improvements over the previously reported converter in the following aspects.

·         High voltage gain with relatively small variation with respect to load changes as well as variation of the transformer's coupling coefficient: this reduces the operating frequency range and minimizes the size of the transcutaneous transformer.

·         Higher efficiency: minimizing the circulating current in the magnetizing inductance, together with zero-voltage switching (ZVS) of the primary switches and zero-current switching (ZCS) of the secondary rectifier diodes, improves the efficiency significantly, especially on the secondary side (inside the body).


E-mail Alert System


Today we are witnessing fast changes in telecommunications. The computer and the telephone are two technologies that have made a significant revolution in communications, but for technological reasons they were developed separately. The fast development of communication and computer technology has led to the merging of the public switched telephone network (PSTN) and the Internet into a global information network of integrated services. Internet services are becoming a more important means of information exchange and communication, turning fixed and mobile telephony toward Internet services.
One of the deficiencies of Internet services compared with fixed and mobile telephony is the availability of the service: Internet services are available only when the user is connected. The results of our research carried out before the development of the E-mail Alert (EMA) system show that Internet users receive on average five to six e-mails every day, and 82 percent of these users check their mailbox first when they connect. Thus there is a clear demand for e-mail alerting systems. The EMA system is a computer telephony integration (CTI) application that integrates the advantages of telephony and the Internet by connecting e-mail and phone services. The EMA system informs users of the arrival of new e-mail messages, which is convenient for enterprises that do not allow their e-mail servers to be accessed from outside. On the other side are Internet service providers with a large number of users. To satisfy both groups of requirements, two versions of the EMA system are proposed. The enterprise version is developed to allow e-mail server access inside intranet environments, while the public version is designed for public service providers. The EMA system is implemented on the Win32 platform using the C and C++ programming languages; HTML, ASP, JavaScript and VBScript are used for the Web interface to overcome differences between Web browsers.
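As a rough illustration of the idea (the original system was written in C/C++ on Win32; the Python sketch below, including the notify_by_phone stub, is hypothetical and only shows the shape of the service), the core of such an alerting system is a loop that polls a mailbox and triggers a telephony-side notification when the unseen-message count grows:

```python
import imaplib
import time

def unseen_count(host, user, password):
    """Return the number of unseen messages in the user's INBOX."""
    with imaplib.IMAP4_SSL(host) as imap:
        imap.login(user, password)
        imap.select("INBOX", readonly=True)
        _, data = imap.search(None, "UNSEEN")
        return len(data[0].split())

def notify_by_phone(user, count):
    """Placeholder for the CTI side: dial or message the subscriber."""
    print(f"Alert {user}: {count} new e-mail message(s) waiting")

def run(host, user, password, poll_seconds=60):
    last = 0
    while True:
        count = unseen_count(host, user, password)
        if count > last:                  # new mail arrived since the last poll
            notify_by_phone(user, count)
        last = count
        time.sleep(poll_seconds)
```

The enterprise and public versions described above differ mainly in where this polling runs (inside the intranet next to the mail server, or at the service provider) rather than in the basic alerting loop.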

Electrical and chemical diagnostics of transformer insulation


The main function of a power system is to supply electrical energy to its customers with an acceptable degree of reliability and quality. Among many other things, the reliability of a power system depends on trouble-free transformer operation. In electricity utilities around the world, a significant number of power transformers are now operating beyond their design life. Most of these transformers are operating without evidence of distress. The same situation is evident in Australia. In PowerLink Queensland (PLQ), 25% of the power transformers were more than 25 years old in 1991. So priority attention should be directed to research into improved diagnostic techniques for determining the condition of the insulation in aged transformers.
The insulation system in a power transformer consists of cellulosic materials (paper, pressboard and transformerboard) and processed mineral oil. The cellulosic materials and oil insulation used in transformers degrade with time. The degradation depends on the thermal, oxidative, hydrolytic, electrical and mechanical conditions which the transformer experiences during its lifetime.
The condition of the paper and pressboard insulation has been monitored by (a) bulk measurements (dissolved gas analysis (DGA), insulation resistance (IR), tan δ and furan analysis) and (b) measurements on samples removed from the transformer (degree of polymerization (DP), tensile strength). At the interface between the paper and oil in the transformer, interfacial polarization may occur, resulting in an increase in the loss tangent and the dielectric loss. A DC method was developed for measuring the interfacial polarization spectrum for the determination of the insulation condition in aged transformers.
This paper makes contributions to the determination of the insulation condition of transformers by bulk measurements and measurements on samples removed from the transformer. It is based on a University of Queensland research project conducted with cooperation from the PLQ and the GEC-Alsthom.
Most of the currently used techniques have some drawbacks. Dissolved gas analysis requires a data bank, based on experimental results from failed transformers, for predicting the fault type. When transformer oil is replaced or refurbished, the analysis of furans in the refurbished oil may not show any trace of degradation, although the cellulose may have degraded significantly. DP estimation is based on a single-point viscosity measurement. Molecular weight studies by single-point viscosity measurements are of limited value when dealing with a complex polymer blend, such as Kraft paper, particularly in cases where the molecular weight distribution of the paper changes significantly as the degradation proceeds. In these instances, a newer technique, gel permeation chromatography (GPC), is likely to be more useful than the viscosity method, because it provides information about the change in molecular weight and molecular weight distribution. Investigation of the GPC technique has been included in this research to assess its effectiveness in determining the condition of insulation.
Conventional electrical properties (dissipation factor and breakdown strength) of cellulosic materials are not significantly affected by ageing, so very little recent research has been directed to electrical diagnostic techniques. In this research project, thorough investigations were also undertaken of the conventional electrical properties, along with the interfacial polarization parameters of the cellulosic insulation materials. The interfacial phenomena are strongly influenced by insulation degradation products, such as polar functionalities, water, etc. The condition of the dielectric and its degradation due to ageing can be monitored by studying the rate and process of polarization, which can be examined using a DC field. Furthermore, this is a non-destructive diagnostic test.
A retired power transformer (25 MVA, 11/132 kV) and several distribution transformers were used for the experimental work. The results from these transformers will be presented and an attempt will be made to correlate the electrical and chemical test results. The variation of the results across different locations in a power transformer will be discussed with reference to their thermal stress distribution. Accelerated ageing experiments were conducted to predict the long-term insulation behaviour, and the results are presented in the accompanying paper.

E-bomb


The next Pearl Harbor will not announce itself with a searing flash of nuclear light or with the plaintive wails of those dying of Ebola or its genetically engineered twin. You will hear a sharp crack in the distance. By the time you mistakenly identify this sound as an innocent clap of thunder, the civilized world will have become unhinged. Fluorescent lights and television sets will glow eerily bright, despite being turned off. The aroma of ozone mixed with smoldering plastic will seep from outlet covers as electric wires arc and telephone lines melt. Your Palm Pilot and MP3 player will feel warm to the touch, their batteries overloaded. Your computer, and every bit of data on it, will be toast. And then you will notice that the world sounds different too. The background music of civilization, the whirl of internal-combustion engines, will have stopped. Save a few diesels, engines will never start again. You, however, will remain unharmed, as you find yourself thrust backward 200 years, to a time when electricity meant a lightning bolt fracturing the night sky. This is not a hypothetical, son-of-Y2K scenario. It is a realistic assessment of the damage that could be inflicted by a new generation of weapons--E-bombs.
Anyone who's been through a prolonged power outage knows that it's an extremely trying experience. Within an hour of losing electricity, you develop a healthy appreciation of all the electrical devices you rely on in life. A couple hours later, you start pacing around your house. After a few days without lights, electric heat or TV, your stress level shoots through the roof. But in the grand scheme of things, that's nothing. If an outage hits an entire city, and there aren't adequate emergency resources, people may die from exposure, companies may suffer huge productivity losses and millions of dollars of food may spoil. If a power outage hit on a much larger scale, it could shut down the electronic networks that keep governments and militaries running. We are utterly dependent on power, and when it's gone, things get very bad, very fast.
An electromagnetic bomb, or e-bomb, is a weapon designed to take advantage of this dependency. But instead of simply cutting off power in an area, an e-bomb would actually destroy most machines that use electricity. Generators would be useless, cars wouldn't run, and there would be no chance of making a phone call. In a matter of seconds, a big enough e-bomb could thrust an entire city back 200 years or cripple a military unit.

CT SCAN


There are two main limitations of using conventional x-rays to examine internal structures of the body. Firstly, the superimposition of three-dimensional information onto a single plane makes diagnosis confusing and often difficult. Secondly, the photographic film usually used for making radiographs has a limited dynamic range, and therefore only objects that have a large variation in x-ray absorption relative to their surroundings will cause sufficient contrast differences on the film to be distinguished by the eye. Thus, while the details of bony structures can be seen, it is difficult to discern the shape and composition of soft-tissue organs accurately.
CT uses special x-ray equipment to obtain image data from different angles around the body and then shows a cross-section of body tissues and organs; that is, it can show several types of tissue (lung, bone, soft tissue and blood vessels) with great clarity. CT of the body is a patient-friendly exam that involves little radiation exposure.
In CT scanning, the image is reconstructed from a large number of absorption profiles taken at regular angular intervals around a slice, each profile being made up of a parallel set of absorption values through the object. That is, CT also passes x-rays through the body of the patient, but the detection method is usually electronic in nature, and the data are converted from an analog signal to digital values in an A/D converter. This digital representation of the x-ray intensity is fed into a computer, which then reconstructs an image.
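A minimal sketch of this reconstruction step (plain, unfiltered back-projection in Python/NumPy; real scanners apply a ramp filter to each profile before back-projecting, which is omitted here for brevity): each absorption profile is smeared back across the image plane at the angle at which it was acquired, and the smears are summed.

```python
import numpy as np
from scipy.ndimage import rotate

def backproject(sinogram, angles_deg):
    """sinogram: (n_angles, n_detectors) array of absorption profiles.
    Returns a crude reconstruction by summing rotated 'smears' of each profile."""
    n_angles, n_det = sinogram.shape
    recon = np.zeros((n_det, n_det))
    for profile, angle in zip(sinogram, angles_deg):
        smear = np.tile(profile, (n_det, 1))          # constant along the beam path
        recon += rotate(smear, angle, reshape=False)  # rotate back to its view angle
    return recon / n_angles
```

The unfiltered version blurs edges; the filtered variant used in practice only adds a high-pass filtering of each profile before the same smearing-and-summing loop.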
The original method of doing tomography uses an x-ray detector which translates linearly on a track across the x-ray beam; when the end of the scan is reached, the x-ray tube and the detector are rotated to a new angle and the linear motion is repeated. The latest generation of CT machines uses a 'fan-beam' geometry with an array of detectors which simultaneously detect x-rays on a number of different paths through the patient.

Biometric Fingerprint Identification


Positive identification of individuals is a very basic societal requirement. Reliable user authentication is becoming an increasingly important task in the web-enabled world. The consequences of an insecure authentication system in a corporate or enterprise environment can be catastrophic, and may include loss of confidential information, denial of service, and compromised data integrity. The value of reliable user authentication is not limited to just computer or network access. Many other applications in everyday life also require user authentication, such as banking and e-commerce, and these could benefit from enhanced security.
In fact, as more interactions take place electronically, it becomes even more important to have an electronic verification of a person's identity. Until recently, electronic verification took one of two forms. It was based on something the person had in their possession, like a magnetic swipe card, or something they knew, like a password. The problem is that these forms of electronic identification are not very secure, because they can be given away, taken away, or lost, and motivated people have found ways to forge or circumvent these credentials.
The ultimate form of electronic verification of a person's identity is biometrics. Biometrics refers to the automatic identification of a person based on his or her physiological or behavioral characteristics, such as a finger scan, retina, iris, voice scan or signature scan. Using this technique, the physiological characteristics of a person can be turned into electronic processes that are inexpensive and easy to use. People have always used the brain's innate ability to recognize a familiar face, and it has long been known that a person's fingerprints can be used for identification.
IDENTIFICATION AND VERIFICATION SYSTEMS
A person's identity can be resolved in two ways: identification and verification. The former involves identifying a person from all biometric measurements collected in a database; this involves a one-to-many match, also referred to as a 'cold search'. "Do I know who you are?" is the inherent question this process seeks to answer. Verification involves authenticating a person's claimed identity against his or her previously enrolled pattern, and this involves a one-to-one match. The question it seeks to answer is, "Are you who you claim to be?"
VERIFICATION
Verification involves comparing a person's fingerprint to one that was previously recorded in the system database. The person claiming an identity provides a fingerprint, typically by placing a finger on a capacitance scanner or an optical scanner. The computer locates the previous fingerprint record using the claimed identity. This process is relatively easy because the computer needs to compare only two fingerprint records. The verification process is referred to as a 'closed search' because the search field is limited. The second question is "Who is this person?" This is the identification function, which is used to prevent duplicate applications or enrollments. In this case a newly supplied fingerprint is compared with all others in the database. A match indicates that the person has already enrolled or applied.
IDENTIFICATION
The identification process, also known as an 'open search', is much more technically demanding. It involves many more comparisons and may require differentiating among several database fingerprints that are similar to the submitted print.
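The difference between the two searches can be made concrete with a small sketch (Python; match_score, the enrolled-template database and the threshold are hypothetical placeholders for a real minutiae matcher): verification performs a single comparison against the claimed identity, while identification compares the probe against every enrolled template.

```python
THRESHOLD = 0.8   # assumed similarity score above which two prints are declared a match

def match_score(probe, template):
    """Placeholder for a real minutiae-based comparison returning 0.0..1.0."""
    raise NotImplementedError

def verify(claimed_id, probe, database):
    """One-to-one 'closed search': compare the probe only with the claimed identity."""
    template = database.get(claimed_id)
    return template is not None and match_score(probe, template) >= THRESHOLD

def identify(probe, database):
    """One-to-many 'open search': compare the probe with every enrolled template."""
    hits = [(pid, match_score(probe, t)) for pid, t in database.items()]
    best_id, best_score = max(hits, key=lambda h: h[1], default=(None, 0.0))
    return best_id if best_score >= THRESHOLD else None
```

The cost of identify grows with the size of the enrolled database, which is why the open search is the technically demanding case.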

Bioinformatics and its evolution


Bioinformatics is an interdisciplinary research area. It is a fusion of computing, biotechnology and the biological sciences. Bioinformatics is poised to be one of the most prodigious growth areas in the next two decades. Being the interface between the most rapidly advancing fields of biological and computational sciences, it is immense in scope and vast in applications.
Bioinformatics is the study of biological information as it passes from its storage site in the genome to the various gene products in the cell. Bioinformatics involves the creation and application of computational technologies for problems in molecular biology. As such, it deals with methods for storing, retrieving and analyzing biological data, such as nucleic acid (DNA/RNA) and protein sequences, structures, functions, pathways and interactions. The science of bioinformatics, which is the melding of molecular biology with computer science, is essential to the use of genomic information in understanding human diseases and in the identification of new molecular targets for drug discovery. New discoveries are being made in the field of genomics, an area of study which looks at the DNA sequence of an organism in order to determine which genes code for beneficial traits and which genes are involved in inherited diseases.
 If you are not tall enough, the stature could be altered accordingly. If you are weak and not strong enough, your physique could be improved. If you think this is the script for a science fiction movie, you are mistaken. It is the future reality.
2. EVOLUTION OF BIOINFORMATICS
DNA is the genetic material of an organism. It contains all the information needed for the development and existence of the organism. The DNA molecule is formed of two long polynucleotide chains which are spirally coiled around each other, forming a double helix; thus it has the form of a spirally twisted ladder. DNA is a molecule made from sugar, phosphate and bases. The bases are guanine (G), cytosine (C), adenine (A) and thymine (T). Adenine pairs only with thymine, and guanine pairs only with cytosine. The various combinations of these bases make up the DNA, for example AAGCT, CCAGT, TACGGT, etc.; a vast number of combinations of these bases is possible. A gene is a sequence of DNA that represents a fundamental unit of heredity. The human genome consists of approximately 30,000 genes, containing approximately 3 billion base pairs.
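The base-pairing rule translates directly into a few lines of code (an illustrative Python sketch; the example sequence is arbitrary): the complement of a strand is obtained by swapping A with T and G with C, and reading it in reverse gives the sequence of the opposite strand.

```python
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Sequence of the complementary strand, read 5' to 3'."""
    return "".join(PAIR[base] for base in reversed(seq.upper()))

print(reverse_complement("AAGCT"))   # -> AGCTT
```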
Currently, scientists are trying to determine the entire DNA sequence of various living organisms. DNA sequence analysis can identify genes, regulatory sequences and other functional elements. Molecular biology, algorithms, and computing have helped in sequencing large portions of the genomes of several species. Sequencing is the determination of the order of nucleotides in a DNA molecule, as also the order of amino acids in a protein. Sequence analysis, which is at the core of bioinformatics, enables the identification of gene function.
The human genome, found in every cell of a human being, consists of 23 pairs of chromosomes. These chromosomes constitute the 3 billion letters of chemical code that specify the blueprint for a human being. The Human Genome Project, one of the best-known projects in the world, is a vast endeavor aimed at reading this entire DNA code, and it will completely transform biology, medicine and biotechnology. Using this entire code, all 30,000 human genes will be identified; all 5,000 inherited diseases will become diagnosable and potentially curable; and drug design will be completely transformed. The Genome Project focuses on two main objectives: mapping, pinpointing the genomic location of all genes and markers; and DNA sequencing, reading the chemical "text" of all the genes and their intervening sequences. DNA sequences are entered into large databases, where they can be compared with known genes, including inter-species comparisons. The explosion of publicly available genomic information resulting from the Human Genome Project has precipitated the need for bioinformatics capabilities.
Determination of genome organization and gene regulation will promote understanding of how humans develop from single cells into adults, why this process sometimes goes wrong, and the changes that take place as people age. Bioinformatics finds applications in medicine for recommending individually tailored drugs based on an individual's genetic profile. It helps to identify a specific genetic sequence that is responsible for a particular disease, its associated protein, and that protein's function. New drugs can then be developed to cure the disease.

WITRICITY - WIRELESS ELECTRICITY


Don't you hate it when you forget to put your mobile phone on charge? Well, take heart - a new technology called WiTricity could mean never having to plug it in again. Welcome to the world of WiTricity. WiTricity, a portmanteau for wireless electricity, is a term which describes wireless energy transfer, the ability to provide electrical energy to remote objects without wires. The term was coined initially in 2005 by Dave Gerding and later used for the project of an MIT research team led by Prof. Marin Soljačić.
WiTricity works on the principle of using coupled resonant objects to transfer electricity to objects without the use of any wires. The concept relies on resonance, whereby an object oscillates strongly when energy of a particular frequency is applied. The MIT researchers were able to power a 60 watt light bulb from a power source located about seven feet away. This was made possible using two copper coils, twenty inches in diameter, designed so that they resonated together in the MHz range. One of these coils was connected to a power source and the other to a bulb. With this setup, the bulb was powered even when the coils were not in line of sight.
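The resonance behind the demonstration is just that of an LC circuit, f = 1/(2π√(LC)). The component values in the sketch below are illustrative assumptions, chosen only to show that coil-sized inductances and small capacitances land in the MHz range used by the MIT team.

```python
import math

def resonant_frequency(L, C):
    """Resonant frequency of an LC circuit, in hertz."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Assumed values: a ~25 uH coil with ~10 pF of capacitance resonates near 10 MHz.
print(f"{resonant_frequency(25e-6, 10e-12) / 1e6:.1f} MHz")
```

Because both coils are tuned to the same frequency, energy couples efficiently between them while ordinary, non-resonant objects nearby absorb very little.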
The main advantages of WiTricity are that it is omnidirectional and that the mess of wires can be avoided, thus enabling easy recharging of electronic gadgets like mobiles and laptops. Also, interactions of environmental objects with the magnetic fields are suppressed, since common materials show little tendency to interact with them. This discovery is different from all previous efforts because it uses "magnetically coupled resonance", which means it will not only be safe but also fairly efficient.
Bringing this technology to the retail sector still faces big impediments right now. The wireless transfer of electricity has been a sci-fi dream up to this point, and truly, if electricity could simply be in the air, in the same way radio waves and Wi-Fi signals are, it would change the world.

Electrodeless lamps


In contrast with all other electrical lamps, which use electrical connections through the lamp envelope to transfer power to the lamp, in electrodeless lamps the power needed to generate light is transferred from the outside of the lamp envelope by means of (electro)magnetic fields. There are two advantages of eliminating electrodes. The first is extended bulb life, because the electrodes are usually the limiting factor in bulb life. The second benefit is the ability to use light-generating substances that would react with metal electrodes in normal lamps.

Aside from the method of coupling energy into the mercury vapor, these lamps are very similar to conventional fluorescent lamps. Mercury vapor in the discharge vessel is electrically excited to produce short-wave ultraviolet light, which then excites the phosphors to produce visible light. While still relatively unknown to the public, these lamps have been available since 1990. The most common form has the shape of an incandescent light bulb. Unlike an incandescent lamp or conventional fluorescent lamps, there is no electrical connection going inside the glass bulb; the energy is transferred through the glass envelope solely by electromagnetic induction.

In the most common form, a glass tube (B) protrudes bulb-wards from the bottom of the discharge vessel (A). This tube contains an antenna called a power coupler, which consists of a coil wound over a tubular ferrite core.

In lower-frequency versions of induction systems, the lamp consists of two long parallel glass tubes, connected by two short tubes that have coils mounted around them.

The antenna coils receive electric power from the electronic ballast (C) that generates a high frequency. The exact frequency varies with lamp design, but popular examples include 13.6 MHz, 2.65 MHz and 250 kHz (in physically large lamps). A special resonant circuit in the ballast produces an initial high voltage on the coil to start a gas discharge; thereafter the voltage is reduced to normal running level.

The system can be seen as a type of transformer, with the power coupler forming the primary coil and the gas discharge arc in the bulb forming the one-turn secondary coil and the load of the transformer. The ballast is connected to mains electricity, and is generally designed to operate on voltages between 100 and 277 VAC at a frequency of 50 or 60 Hz. Most ballasts can also be connected to DC voltage sources like batteries for emergency lighting purposes.

In other conventional gas discharge lamps, the electrodes are the part with the shortest life, limiting the lamp lifespan severely. Since an induction lamp has no electrodes, it can have a very long service life. For induction lamp systems with a separate ballast, the service life can be as long as 100,000 hours, which is 11.4 years of continuous operation, or 22.8 years if used at night or by day only. For induction lamps with integrated ballast, the life is 15,000 to 30,000 hours. Extremely high-quality electronic circuits are needed for the ballast to attain such a long service life. Such expensive lamps have special application areas in situations where replacement costs are high.

Research on electrodeless lamps continues, with variations in operating frequency, lamp shape, the induction coils and other design parameters. Low public awareness and relatively high prices have so far kept the use of such lamps highly specialized.

Adaptive Piezoelectric energy harvesting circuit

The need for a wireless electrical power supply has spurred an interest in piezoelectric energy harvesting, or the extraction of electrical energy using a vibrating piezoelectric device. Examples of applications that would benefit from such a supply are a capacitively tuned vibration absorber, a foot-powered radio tag and a PicoRadio. A vibrating piezoelectric device differs from a typical electrical power source in that its internal impedance is capacitive rather than inductive in nature, and in that it is driven by mechanical vibrations of varying amplitude and frequency. While there have been previous approaches to harvesting the energy generated by a piezoelectric device, there has not been an attempt to develop an adaptive circuit that maximizes power transfer from the piezoelectric device. The objective of the research described herein was to develop an approach that maximizes the power transferred from a vibrating piezoelectric transducer to an electrochemical battery. The paper initially presents a simple model of a piezoelectric transducer. An ac-dc rectifier is added and the model is used to determine the point of optimal power flow for the piezoelectric element. The paper then introduces an adaptive approach to achieving the optimal power flow through the use of a switch-mode dc-dc converter. This approach is similar to the so-called maximum power point trackers used to maximize power from solar cells. Finally, the paper presents experimental results that validate the technique.
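The adaptive idea parallels the perturb-and-observe loops used by solar maximum power point trackers. The sketch below (Python; measure_battery_power and set_duty_cycle are hypothetical hardware interfaces, and the step size and settling time are assumptions) keeps nudging the dc-dc converter's duty cycle in whichever direction last increased the harvested power.

```python
import time

def perturb_and_observe(set_duty_cycle, measure_battery_power,
                        duty=0.05, step=0.01, period_s=0.5):
    """Adjust the dc-dc converter duty cycle toward maximum power transfer.

    set_duty_cycle(d): hypothetical driver setting the converter duty cycle (0..1).
    measure_battery_power(): hypothetical sensor returning power into the battery (W).
    """
    set_duty_cycle(duty)
    last_power = measure_battery_power()
    direction = 1
    while True:
        duty = min(max(duty + direction * step, 0.0), 1.0)
        set_duty_cycle(duty)
        time.sleep(period_s)                 # let the converter and rectifier settle
        power = measure_battery_power()
        if power < last_power:
            direction = -direction           # the last step hurt; reverse direction
        last_power = power
```

Because the vibration amplitude and frequency drift over time, the loop never stops: it continually re-seeks the duty cycle at which the rectified piezoelectric source delivers its maximum power.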