
Extreme Ultraviolet Lithography

Silicon has been the heart of the world's technology boom for nearly half a century, but microprocessor manufacturers have all but squeezed the life out of it. The current technology used to make microprocessors will begin to reach its limit around 2005. At that time, chipmakers will have to look to other technologies to cram more transistors onto silicon to create more powerful chips. Many are already looking at extreme-ultraviolet lithography (EUVL) as a way to extend the life of silicon at least until the end of the decade.

Potential successors to optical projection lithography are being aggressively developed. These are known as "next-generation lithographies" (NGLs). EUV lithography (EUVL) is one of the leading NGL technologies; others include x-ray lithography, ion-beam projection lithography, and electron-beam projection lithography. Using extreme-ultraviolet (EUV) light to carve transistors in silicon wafers will lead to microprocessors that are up to 100 times faster than today's most powerful chips, and to memory chips with similar increases in storage capacity.
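The gain from a shorter wavelength can be made concrete with the Rayleigh resolution criterion, R = k1·λ/NA. The sketch below compares deep-UV (193 nm) with EUV (13.5 nm); the k1 and numerical-aperture values are illustrative assumptions, not figures from the text.

```python
# Resolution scaling under the Rayleigh criterion: R = k1 * lambda / NA.
# The k1 and NA values below are assumed, illustrative numbers.

def min_feature_nm(wavelength_nm: float, na: float, k1: float = 0.4) -> float:
    """Smallest printable feature (nm) for a given exposure wavelength."""
    return k1 * wavelength_nm / na

# Deep-UV (193 nm) versus extreme-UV (13.5 nm):
duv = min_feature_nm(193.0, na=0.6)
euv = min_feature_nm(13.5, na=0.25)
print(f"193 nm DUV: ~{duv:.0f} nm features")
print(f"13.5 nm EUV: ~{euv:.0f} nm features")
```

Even with a smaller numerical aperture, the fourteen-fold drop in wavelength is what lets EUVL print far finer features than optical lithography.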

Extreme Programming

Extreme Programming (XP) is a deliberate and disciplined approach to software development. About six years old, it has already been proven at many companies of all sizes and industries worldwide. XP is successful because it stresses customer satisfaction. The methodology is designed to deliver the software your customer needs when it is needed. XP empowers software developers to respond confidently to changing customer requirements, even late in the life cycle. This methodology also emphasizes teamwork. Managers, customers, and developers are all part of a team dedicated to delivering quality software. XP implements a simple yet effective way to enable groupware-style development.
XP improves a software project in four essential ways: communication, simplicity, feedback, and courage. XP programmers communicate with their customers and fellow programmers. They keep their design simple and clean. They get feedback by testing their software starting on day one. They deliver the system to the customers as early as possible and implement changes as suggested. With this foundation, XP programmers are able to respond courageously to changing requirements and technology. XP is different. It is a lot like a jigsaw puzzle. There are many small pieces. Individually the pieces make no sense, but when combined a complete picture emerges. This is a significant departure from traditional software development methods and ushers in a change in the way we program.
If one or two developers have become bottlenecks because they own the core classes in the system and must make all the changes, then try collective code ownership. You will also need unit tests. Let everyone make changes to the core classes whenever they need to. You can continue this way until no problems are left, then add the remaining practices as you can. The first practice you add will seem easy. You are solving a large problem with a little extra effort. The second might seem easy too. But at some point between having a few XP rules and all of the XP rules, it will take some persistence to make it work, even though your earlier problems will have been solved and your project will be under control. It might seem tempting to abandon the new methodology and go back to what is familiar and comfortable, but continuing does pay off in the end. Your development team will become much more efficient than you thought possible. At some point you will find that the XP rules no longer seem like rules at all. There is a synergy between the rules that is hard to understand until you have been fully immersed. This uphill climb is especially true of pair programming, but the payoff of this technique is very large. Unit tests also take time to accumulate, but they are the foundation for many of the other XP practices, so the payoff is very great.
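The unit tests that underpin collective code ownership can be sketched very simply. The function and scenario below are hypothetical examples, not part of any particular XP project; they only illustrate the test-first habit of writing checks that any team member can run before and after changing shared code.

```python
import unittest

def add_line_item(total: float, price: float, quantity: int) -> float:
    """Hypothetical shared core-class routine: accumulate an order total."""
    if quantity < 0:
        raise ValueError("quantity must be non-negative")
    return total + price * quantity

class AddLineItemTest(unittest.TestCase):
    # In XP these tests run continuously, so any developer may change
    # the shared code and know immediately whether the change broke it.
    def test_accumulates_total(self):
        self.assertEqual(add_line_item(10.0, 2.5, 4), 20.0)

    def test_rejects_negative_quantity(self):
        with self.assertRaises(ValueError):
            add_line_item(0.0, 1.0, -1)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(AddLineItemTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A failing test pinpoints the broken behaviour before the change is integrated, which is what makes shared ownership of core classes safe.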

Energy transmission system for an artificial heart- leakage inductance compensation

The artificial heart now in use, like the natural heart it is designed to replace, is a four-chambered device for pumping blood. Electrical circulatory assist devices such as the total artificial heart or ventricular assist devices generally use a brushless dc motor as their pump. They require 12–35 W to operate and can be powered by a portable battery pack and a dc–dc converter.
It would be desirable to transfer electrical energy to these circulatory assist devices transcutaneously, without breaking the skin. This technique requires a power supply that uses a transcutaneous transformer to drive the motor of the circulatory assist device. The secondary of this transformer would be implanted under the skin, and the primary would be placed on top of the secondary, external to the body. The distance between the transformer windings would be approximately equal to the thickness of the patient's skin, nominally between 1–2 cm. This spacing cannot be assumed constant; the alignment of the cores and the distance between them would certainly vary during operation.
A transformer with a large (1–2 cm) air gap between the primary and the secondary has large leakage inductances. In this application, the coupling coefficient k ranges approximately from 0.1 to 0.4. This makes the leakage inductances of the same order of magnitude and usually larger than the magnetizing inductance. Therefore, the transfer gain of voltage is very low, and a significant portion of the primary current will flow through the magnetizing inductance. The large circulating current through the magnetizing inductance results in poor efficiency.
A dc–dc converter employing secondary-side resonance has been reported to alleviate these problems by lowering the impedance of the secondary side with a resonant circuit. Although the circulating current is lowered, the voltage transfer gain varies widely as the coupling coefficient varies, so the advantageous characteristics are lost as the coupling coefficient deviates from its design value.
In this paper, compensation of the leakage inductances on both sides of the transcutaneous transformer is presented. This converter offers significant improvements over the previously reported converter in the following aspects:

• High voltage gain with relatively small variation with respect to load change as well as to variation of the transformer coupling coefficient; this reduces the operating frequency range and minimizes the size of the transcutaneous transformer.

• Higher efficiency; minimizing the circulating current through the magnetizing inductance, zero-voltage switching (ZVS) of the primary switches, and zero-current switching (ZCS) of the secondary rectifier diodes improve efficiency significantly, especially on the secondary side (inside the body).
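The scale of the problem, and the compensation idea, can be sketched numerically. In a simple T-model with unity turns ratio, a self-inductance L splits into a leakage part (1−k)·L and a magnetizing part k·L, and a series capacitor can be chosen to resonate with the leakage inductance at the operating frequency. All component values and the frequency below are illustrative assumptions, not values from the paper.

```python
import math

# Sketch of series compensation of transformer leakage inductance.
# Inductance, coupling range and frequency are assumed, illustrative values.

def leakage_split(L_self: float, k: float):
    """Simple T-model split: leakage part (1-k)*L, magnetizing part k*L."""
    return (1.0 - k) * L_self, k * L_self

def series_comp_cap(L_leak: float, f_res: float) -> float:
    """Capacitor that resonates with the leakage inductance at f_res."""
    return 1.0 / ((2.0 * math.pi * f_res) ** 2 * L_leak)

L1 = 40e-6            # assumed 40 uH primary self-inductance
for k in (0.1, 0.4):  # coupling range quoted for a 1-2 cm air gap
    L_lk, L_m = leakage_split(L1, k)
    C = series_comp_cap(L_lk, f_res=200e3)   # assumed 200 kHz operation
    print(f"k={k}: leakage={L_lk*1e6:.1f} uH, magnetizing={L_m*1e6:.1f} uH, "
          f"C={C*1e9:.1f} nF")
```

At k = 0.1 the leakage part (36 µH here) dwarfs the magnetizing part (4 µH), which is exactly why an uncompensated loosely coupled transformer has such a poor transfer gain.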


Embedded System in Automobiles

We read in newspapers that a doctor successfully implanted a cardiac pacemaker in his patient's chest while sitting around 200 kilometres away. We also know of driverless cars that can take us to our destination using their inbuilt navigation systems. Embedded microprocessors or microcontrollers are the brains behind these.
An embedded system is any device controlled by instructions stored on a chip. These devices are usually controlled by a microprocessor that executes the instructions stored on a read-only memory (ROM) chip.
The software for an embedded system is called firmware. The firmware is written in assembly language for time- or resource-critical operations, or in higher-level languages like C or embedded C. The software is simulated using microcode simulators for the target processor. Since embedded systems are supposed to perform only specific tasks, these programs are stored in read-only memories (ROMs). Moreover, they may need no or minimal input from the user, so user-interface devices like a monitor, mouse, and large keyboard may be absent.
Embedded systems are computer systems that monitor, respond to, or control an external environment. This environment is connected to the computer system through sensors, actuators, and other input-output interfaces. It may consist of physical or biological objects of any form and structure. Often humans are part of the connected external world, but a wide range of other natural and artificial objects, as well as animals, are also possible.
Embedded systems are also known as real-time systems, since they respond to an input or event and produce the result within a guaranteed time period. This time period can range from a few microseconds to days or months. The computer system must meet various timing and other constraints imposed on it by the real-time behavior of the external world to which it is interfaced; hence the name real time. Another name for many of these systems is reactive systems, because their primary purpose is to respond or react to signals from their environment. A real-time computer system may be a component of a larger system in which it is embedded; reasonably, such a computer component is called an embedded system.
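The "guaranteed time period" idea can be sketched as a periodic control loop that records whether each iteration overran its deadline. This is only an illustration of the concept; the period and dummy task are assumptions, and a real embedded system would rely on an RTOS scheduler rather than a sleep loop.

```python
import time

# Illustrative periodic loop with deadline-miss accounting.

def run_periodic(task, period_s: float, iterations: int) -> int:
    """Run `task` every `period_s` seconds; return the number of deadline misses."""
    misses = 0
    next_release = time.monotonic()
    for _ in range(iterations):
        start = time.monotonic()
        task()
        if time.monotonic() - start > period_s:   # task overran its period
            misses += 1
        next_release += period_s
        time.sleep(max(0.0, next_release - time.monotonic()))
    return misses

misses = run_periodic(lambda: sum(range(1000)), period_s=0.01, iterations=5)
print(f"deadline misses: {misses}")
```

In a hard real-time system a single miss may be a failure; in a soft real-time system the miss count is a quality metric.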
Embedded systems control engine management systems in automobiles, monitor home heating systems and regulate the quiet operation and the even distribution of laundry in washing machines. They are the heart of toys like Furby and Tamagotchi, of golf balls that cannot get lost and of gas pumps at gasoline stations that advertise nearby restaurants on video. Above all, state-of-the art communications equipment like WAP mobile telephones, MP3 players, set-top boxes and Net devices would not be possible without these powerful miniature brains.
Applications and examples of real time systems are ubiquitous and proliferating, appearing as part of our commercial, government, military, medical, educational, and cultural infrastructures. Included are:

  • Vehicle systems for automobiles, subways, aircraft, railways and ships.

  • Traffic control for highways, airspace, railway tracks and shipping lanes.

  • Process control for power plants, chemical plants and consumer products such as soft drinks and beer.

  • Medical systems for radiation therapy, patient monitoring and defibrillation.

  • Military uses such as firing weapons, tracking and command and control.

  • Manufacturing systems with robots.

  • Telephone, radio and satellite communications.

  • Computer games.

  • Multi media systems that provide text, graphic, audio and video interfaces.

  • Household systems for monitoring and controlling appliances.

  • Building managers that control such entities as heat, light, doors and elevators.

Electronics Meet Animal Brains

Until recently, neurobiologists have used computers for simulation, data collection, and data analysis, but not to interact directly with nerve tissue in live, behaving animals. Although digital computers and nerve tissue both use voltage waveforms to transmit and process information, engineers and neurobiologists have yet to cohesively link the electronic signaling of digital computers with the electronic signaling of nerve tissue in freely behaving animals.
Recent advances in microelectromechanical systems (MEMS), CMOS electronics, and embedded computer systems will finally let us link computer circuitry to neural cells in live animals and, in particular, to reidentifiable cells with specific, known neural functions. The key components of such a brain-computer system include neural probes, analog electronics, and a miniature microcomputer. Researchers developing neural probes such as sub-micron MEMS probes, microclamps, microprobe arrays, and similar structures can now penetrate and make electrical contact with nerve cells without causing significant or long-term damage to probes or cells.
Researchers developing analog electronics such as low-power amplifiers and analog-to-digital converters can now integrate these devices with microcontrollers on a single low-power CMOS die. Further, researchers developing embedded computer systems can now incorporate all the core circuitry of a modern computer on a single silicon chip that can run on minuscule power from a tiny watch battery. In short, engineers have all the pieces they need to build truly autonomous implantable computer systems.
Until now, high signal-to-noise recording as well as digital processing of real-time neuronal signals have been possible only in constrained laboratory experiments. By combining MEMS probes with analog electronics and modern CMOS computing into self-contained, implantable microsystems, implantable computers will free neuroscientists from the lab bench.
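On the digital side, one of the simplest real-time operations such an implantable microcomputer performs is spike detection on the sampled neural voltage. The sketch below applies a fixed threshold to a synthetic trace; the waveform and threshold are made-up illustrative values, not recorded data.

```python
# Threshold spike detection on a sampled extracellular voltage trace.
# The trace and threshold are synthetic, illustrative assumptions.

def detect_spikes(samples, threshold):
    """Return indices where the signal crosses the threshold upward."""
    crossings = []
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            crossings.append(i)
    return crossings

# Synthetic trace: baseline noise with two "action potentials".
trace = [0.02, -0.01, 0.03, 0.9, 0.1, -0.02, 0.01, 0.85, 0.05, 0.0]
spikes = detect_spikes(trace, threshold=0.5)
print(f"spikes at sample indices: {spikes}")
```

Detecting only upward crossings, rather than every supra-threshold sample, keeps one action potential from being counted several times.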

E-mail Alert System

Today we are witnessing fast changes in telecommunications. Computer and telephone are two technologies that have made a significant revolution in communications, but for technological reasons they were developed separately. The fast development of communication and computer technology has led to the merging of the public switched telephone network (PSTN) and the Internet into a global information network of integrated services. Internet services are becoming a more important way of information exchange and communication, turning fixed and mobile telephony toward Internet services.
One of the deficiencies of Internet services compared with fixed and mobile telephony is availability: Internet services are available only when the user is connected. The results of our research carried out before the development of the E-mail Alert (EMA) system show that Internet users receive on average five to six e-mails every day, and that 82 percent of these users check their mailbox first in the course of their Internet connection. Thus there is a clear demand for e-mail alerting systems. The EMA system is a computer telephony integration (CTI) application that integrates the advantages of telephony and the Internet by connecting e-mail and phone services. The EMA system informs users of the arrival of new e-mail messages, which is convenient for organizations that do not allow e-mail server access from outside. On the other side are Internet service providers with a large number of users. To satisfy both groups of requirements, two versions of the EMA system are proposed. The enterprise version is developed to allow e-mail server access inside intranet environments, while the public version is designed for public service providers. The EMA system is implemented on the Win32 platform using the C and C++ programming languages; HTML, ASP, JavaScript and VBScript are used for the Web interface to overcome differences among Web browsers.
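The mailbox-polling half of such an alerting system can be sketched in a few lines over IMAP. This is not the EMA system's actual design (which is C/C++ on Win32); it is a minimal Python illustration, and the host, credentials, and alert hook are hypothetical placeholders.

```python
import imaplib

# Sketch: count unseen messages over IMAP and trigger an alert if any.

def count_unseen(conn) -> int:
    """Return the number of UNSEEN messages in INBOX."""
    conn.select("INBOX", readonly=True)
    typ, data = conn.search(None, "UNSEEN")
    if typ != "OK":
        return 0
    return len(data[0].split())

def poll_once(conn, alert) -> None:
    n = count_unseen(conn)
    if n > 0:
        alert(f"You have {n} new e-mail message(s)")  # e.g. dial out / SMS

# Usage (placeholder host and credentials):
#   conn = imaplib.IMAP4_SSL("imap.example.com")
#   conn.login("user", "password")
#   poll_once(conn, alert=print)
```

Separating `count_unseen` from the alert hook mirrors the EMA split between mailbox access (enterprise vs. public version) and the telephony side that delivers the notification.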

Embedded DRAM

Even though the word DRAM has been quite common among us for many decades, the development in the field of DRAM was very slow. The storage medium reached the present state of semiconductor after a long scientific research. Once the semiconductor storage medium was well accepted by all, plans were put forward to integrate the logic circuits associated with the DRAM along with the DRAM itself. However, technological complexities and economic justification for such a complex integrated circuit are difficult hurdles to overcome. Although scientific breakthroughs are numerous in the commodity DRAM industry, similar techniques are not always appropriate when high- performance logic circuits are included on the same substrate. Hence, eDRAM pioneers have begun to develop numerous integration schemes. Two basic integration philosophies for an eDRAM technology are:

  • Incorporating memory circuits in a technology optimized for low-cost, high-performance logic.
  • Incorporating logic circuits in a technology optimized for high-density, low-performance DRAM.
This seemingly subtle semantic difference significantly impacts mask count, system performance, peripheral circuit complexity, and total memory capacity of eDRAM products. Furthermore, corporations with aggressive commodity DRAM technology do not have expertise in the design of complicated digital functions and are not able to assemble a design team to complete the task of a truly merged DRAM-logic product. Conversely, small application-specific integrated circuit (ASIC) design corporations, unfamiliar with DRAM-specific elements and design practice, cannot carry out an efficient merged-logic design and therefore mar the beauty of the original intent to integrate. Clearly, the reuse of process technology is an enabling factor en route to cost-effective eDRAM technology. By the same account, modern circuit designers should be familiar with the new elements of eDRAM technology so that they can efficiently reuse DRAM-specific structures and elements in other digital functions. The reuse of additional electrical elements is a methodology that will make eDRAM more than just a memory interconnected to a few million Boolean gates.
In the following sections of this report the DRAM applications and architectures that are expected to form the basis of eDRAM products are reviewed. Then a description of elements found in generic eDRAM technologies is presented so that non-memory-designers can become familiar with eDRAM specific elements and technology. Various technologies used in eDRAM are discussed. An example of eDRAM is also discussed towards the end of the report.
It can be clearly seen from this report that an embedded DRAM macro extends the on-chip capacity to more than 40 MB, allowing historically off-chip memory to be integrated on chip and enabling System-on-a-Chip (SoC) designs. With memory integrated on chip, the bandwidth is increased to a great extent. A highly integrated DRAM approach also simplifies board design, thereby reducing overall system cost and time to market. Even more importantly, embedding DRAM enables higher bandwidth by allowing a wider on-chip bus and saves power by eliminating DRAM I/O.
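The bandwidth claim is simple arithmetic: peak bandwidth is bus width times transfer rate. The widths and clock below are assumed, illustrative figures chosen only to show how a wider on-chip bus multiplies bandwidth at the same clock.

```python
# Back-of-envelope sketch: peak bandwidth = bus width (bytes) * transfer rate.
# The 64-bit/512-bit widths and 200 MHz clock are illustrative assumptions.

def peak_bandwidth_GBps(bus_bits: int, mhz: float) -> float:
    """Peak bandwidth in GB/s for one transfer per clock."""
    return bus_bits / 8 * mhz * 1e6 / 1e9

off_chip = peak_bandwidth_GBps(64, 200)    # narrow external DRAM bus
on_chip  = peak_bandwidth_GBps(512, 200)   # wide embedded-DRAM bus
print(f"off-chip: {off_chip:.1f} GB/s, on-chip: {on_chip:.1f} GB/s")
```

An eight-fold wider bus gives eight times the bandwidth without raising the clock, which is also where the power saving from eliminating external DRAM I/O comes from.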

Electronics seminar topics

1. Real Time Speech Translation
2. Cellular Neural Network
3. CorDECT
4. Augmented reality.
6. Wavelet Video Processing Technology
7. Enhanced data rates for gsm evolution edge.
8. Terahertz Waves And Applications
9. Smart Pixel Arrays
10. Fibre Optic Communication
11. Molecular Electronics
12. Jseg-a method for unsupervised segmentation of color texture regions in images and video.
13. Asynchronous Transfer Mode
14. Molecular Finger printing
15. FinFET Technology
16. Crusoe
18. Power of Grid Computing
19. Lightning Protection Using LFAM
20. Extreme ultraviolet lithography
21. Optical Burst Switching
22. Spintronics
23. GSM Security And Encryption
24. Augmented reality.
25. Nanotechnology
26. Eye gaze human-computer interface
27. Laser Communications
28. Optic Fibre Cable
29. Personal Area Network
30. Surge Protection In Modern Devices
31. EDGE
32. Artificial Intelligence Substation Control
33. Speed Detection of moving vehicle using speed cameras
34. Microelectronic Pills
35. FireWire
36. Search For Extraterrestrial Intelligence
37. Modern Irrigation System Towards Fuzzy
38. RAID
39. Robotics
40. Rapid Prototyping
41. Lightning Protection Using LFAM
43. MOCT
44. Convergence Of Microcontrollers And DSPs
46. Free Space Laser Communications
47. Cellular Radio
48. The Thought Translation Device (Ttd)
49. Fractal Robots
50. Quantum dots
51. Voice recognition based on artificial neural networks.
52. Cellular geolocation.
53. VT Architecture
54. A Basic Touch-Sensor Screen System
55. Digital Audio Broadcasting
57. Robotics
58. Optical Communications in Space
59. Compact peripheral component interconnect (CPCI)
60. Power over Ethernet
61. Class-D Amplifiers
62. Terrestrial Trunked Radio
63. LWIP
64. Optic Fibre Cable
65. Extreme Ultraviolet Lithography
66. Silicon Photonics
68. Power Consumption Minimisation in Embedded Systems
69. Nanorobotics
70. The making of quantum dots.
71. Cellular Positioning
72. Extreme Ultraviolet Lithography
73. MIMO Wireless Channels: Capacity and Performance Prediction
74. Wireless Application Protocol
75. Optical Networking and Dense Wavelength Division
76. Artificial immune system.
77. BiCMOS technology
78. An Efficient Algorithm for iris pattern
79. Radio Frequency Identification (RFID)
80. Narrow Band & Broad Band ISDN
81. Fluorescent Multi-layer Disc
82. VoCable
83. Cellular technologies and security.
84. Packet Switching chips
85. Digital Light Processing
86. Landmine Detection Using Impulse Ground Penetrating Radar
87. Line-Reflect-Reflect Technique
88. EUV Lithography
89. Digital Audio's Final Frontier-Class D Amplifier
90. Wideband Sigma Delta PLL Modulator
91. Mobile Virtual Reality Service
92. Ultrasonic Trapping In Capillaries For Trace-Amount Biomedical Analysis.
93. Sensors on 3D Digitization
94. Frequency Division Multiple Access
95. Remote Accessible Virtual Instrumentation Control Lab


1. Quadrics network
2. Worldwide Interoperability for Microwave Access
3. FPGA offloads DSPs
4. Real-Time Obstacle Avoidance
5. Light emitting polymers
6. E-Commerce
7. Extreme ultraviolet lithography
8. Low Power UART Design for Serial Data Communication
9. Multi threading microprocessors
10. Passive Millimeter-Wave
11. Magnetic Resonance Imaging
12. Microelectronic Pills
13. Multisensor Fusion and Integration
14. Molecular Electronics
15. Money Pad, The Future Wallet
16. Treating Cardiac Disease With Catheter-Based Tissue Heating
17. Adaptive Multipath Detection
18. Heliodisplay
19. Virtual Reality
20. Real Time System Interface
21. Wireless LED
22. Real-Time Image Processing Applied To Traffic
23. Class-D Amplifiers
24. Radiation Hardened Chips
25. Time Division Multiple Access
26. Embryonics Approach Towards Integrated Circuits
27. Cellular Digital Packet Data (Cdpd)
28. EC2 Technology
29. Crusoe Processor
30. Swarm intelligence & traffic Safety
31. Software Radio
32. Integrated Power Electronics Module
33. Power System Contingencies
34. e-Paper Display
36. Push Technology
37. Distributed Integrated Circuits
38. Electronics Meet Animal Brains
39. Navbelt and Guidicane
40. Orthogonal Frequency Division Multiplexing
41. Organic LED
42. Optical networking
43. Tunable Lasers
44. Code Division Duplexing
45. Satellite Radio TV System
46. Code Division Multiple Access
47. Project Oxygen
48. Robotic balancing
49. Integer Fast Fourier Transform
50. Daknet
51. Cryptography
52. 3- D IC's
53. Continuously variable transmission (CVT)
54. Fibre Optic Communication
55. AC Performance Of Nanoelectronics
56. Continuously variable transmission (CVT)
57. Intel express chipsets.
58. Military Radars
59. Moletronics- an invisible technology
60. Significance of real-time transport Protocol in VOIP
61. Acoustics
62. Testing cardiac diseased based on catheter based tissue heating
63. Cellular Through Remote Control Switch
64. Touch Screens
65. Implementation Of Zoom FFT in Ultrasonic Blood Flow Analysis
66. FRAM
67. The Bionic Eye
68. Synchronous Optical Network
69. Satellite Radio
70. Nanotechnology
71. Fault Diagnosis Of Electronic System using AI
72. Asynchronous Chips
73. E-Nose
74. Holographic Data Storage
76. Crystalline Silicon Solar Cells
77. Space Robotics
78. Guided Missiles
79. Synchronous Optical Networking
80. Cyberterrorism
81. Plasma Antennas
82. Welding Robots
83. Laser Communications
84. Architectural requirements for a DSP processor
85. High-availability power systems Redundancy options
86. Utility Fog
88. DSP Processor
89. e-governance.
90. Smart Pixel Arrays
91. The mp3 standard.
92. Resilient Packet Ring RPR.
93. Fast convergence algorithms for active noise control in vehicles
94. Thermal infrared imaging technology
96. ISO Loop magnetic couplers
97. Evolution Of Embedded System
98. Guided Missiles
99. Iris Scanning
100. QoS in Cellular Networks Based on MPT
101. Vertical Cavity Surface Emitting Laser
102. Driving Optical Network Evolution
103. Home Audio Video Interoperability (HAVi)
104. Sensotronic Brake Control
105. Cruise Control Devices
106. Zigbee - zapping away wired worries
107. Global Positioning System
108. Passive Millimeter-Wave
109. High-availability power systems Redundancy options
110. Light emitting polymers
111. Advanced Mobile Presence Technology
112. Resilient packet ring rpr.
113. Electronic Road Pricing System~
114. CorDECT
115. Artificial neural networks based Devnagri numeral recognitions by using S.O.M
116. Dig Water
117. Fusion Memory
118. Military Radars
119. Satellite Radio TV System
120. Landmine Detection Using Impulse Ground Penetrating Radar
121. low Quiescent current regulators
122. Stream Processor
123. Wireless communication
124. Object Oriented Concepts
125. Internet Protocol Television
127. MOCT
128. VLSI Computations
129. Terahertz Transistor
130. Integer Fast Fourier Transform
131. Surface Mount Technology
132. The Vanadium Redox Flow Battery System
133. Terrestrial Trunked Radio
134. Fuzzy Logic
135. Dual Energy X-ray Absorptiometry
136. Cellular technologies and security.
137. Automatic Number Plate Recognition
138. Turbo codes.
139. CRT Display
140. HVAC
141. Ultra wide band technology.
142. GPRS
143. Optical Switching
144. VCSEL
145. Organic Light Emitting Diode
146. Orthogonal Frequency Division Multiplexing
147. Time Division Multiple Access
148. Elliptical curve cryptography ECC
149. Service Aware Intelligent GGSN
150. Space Time Adaptive Processing
151. Wireless LED
152. Blast
153. Radio Astronomy
154. Quantum cryptography
155. Organic Electronic Fibre
156. Fundamental Limits Of Silicon Technology
157. Digital Audio's Final Frontier-Class D Amplifier
158. Bluetooth based smart sensor networks
159. Optical Camouflage
160. Artifical Eye
161. Digital Imaging
162. RFID Radio Frequency Identification


E-Intelligence

Organizations have, over the years, successfully employed business intelligence tools like OLAP and data warehousing to improve the supply of business information to end users for cross-industry applications like finance and customer relationship management, and in vertical markets such as retail, manufacturing, healthcare, banking, financial services, telecommunications, and utilities. In recent years, the Internet has opened up an entirely new channel for marketing and selling products, and companies are taking to e-business in a big way. The issue facing end users as organizations deploy e-business systems is that they have not had the same business intelligence capabilities available to them in e-business systems as they do in the traditional corporate operating environment. This prevents businesses from exploiting the full power of the Internet as a sales and marketing channel.
As a solution, vendors are now developing business intelligence applications to capture and analyze the information flowing through e-business systems, and are developing Web-based information portals that provide an integrated and personalized view of enterprise-wide business information, applications, and services. These advanced business intelligence systems are called e-intelligence systems.

Coffee Analysis With An Electronic Nose

Electronic noses (ENs), in the broadest meaning, are instruments that analyze gaseous mixtures to discriminate between different (but similar) mixtures and, in the case of simple mixtures, to quantify the concentration of the constituents. An EN consists of a sampling system (for a reproducible collection of the mixture), an array of chemical sensors, electronic circuitry and data analysis software. Chemical sensors, which are the heart of the system, can be divided into three categories according to the type of sensitive material used: inorganic crystalline materials (e.g. semiconductors, as in MOSFET structures, and metal oxides); organic materials and polymers; and biologically derived materials.
The use of ENs for food quality analysis tasks is twofold. ENs are normally used to discriminate between different classes of similar odour-emitting products. In particular, ENs have already served to distinguish between different coffee blends and between different coffee roasting levels. On the other hand, ENs can also be used to predict sensorial descriptors of food quality as determined by a panel (one often speaks generically of correlating EN and sensory data). ENs can therefore represent a valid aid for routine food analysis.
The combination of gas chromatography and mass spectroscopy (GC-MS) is by far the most popular technique for the identification of volatile compounds in foods and beverages. This is because the separation achieved by the gas chromatographic technique is complemented by the high sensitivity of mass spectroscopy and its ability to identify the molecules eluting from the column on the basis of their fragmentation patterns. Detection limits as low as 1 ppb (parts per billion) are frequently reached. The main drawbacks of the approach are, however, the cost and complexity of the instrumentation and the time required to fully analyze each sample (around one hour for a complete chromatogram). Comparatively, ENs are simpler, cheaper devices. They recognize a fingerprint, that is, global information about the samples to be classified. For food products, the sensory characteristics determined by a panel are important for quality assessment. While man is still the most efficient instrument for sensorial evaluation, the formation of a panel of trained judges involves considerable expense.
Commercial coffees are blends, which, for economic reasons, contain (monovarietal) coffees of various origins. For the producers the availability of analysis and control techniques is of great importance. There exists a rich literature on the characterization of coffee using the chemical profile of one of its fractions, such as the headspace of green or roasted beans or the phenolic fraction. In the literature up to 700 diverse molecules have been identified in the headspace. Their relative abundance depends on the type, provenance and manufacturing of the coffee. It is to be noticed that none of these molecules can alone be identified as a marker. On the contrary one has to consider the whole spectrum, as for instance the gas chromatographic profile.
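The "fingerprint" classification an EN performs can be sketched as a nearest-centroid rule over sensor-array response vectors. The blends, response values, and three-sensor array below are made-up illustrative data, not measurements.

```python
# Nearest-centroid classification of sensor-array fingerprints.
# Blends and response vectors are synthetic, illustrative assumptions.

def centroid(vectors):
    """Component-wise mean of a list of equal-length response vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Return the class whose centroid is nearest in Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

training = {
    "blend A": [[0.9, 0.2, 0.4], [1.0, 0.25, 0.35]],
    "blend B": [[0.3, 0.8, 0.6], [0.35, 0.75, 0.65]],
}
centroids = {label: centroid(vs) for label, vs in training.items()}
print(classify([0.92, 0.22, 0.38], centroids))
```

Practical EN pipelines use richer pattern-recognition methods (PCA, neural networks), but the principle is the same: the whole response pattern, not any single sensor, carries the discrimination.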

Electrical and chemical diagnostics of transformer insulation

The main function of a power system is to supply electrical energy to its customers with an acceptable degree of reliability and quality. Among many other things, the reliability of a power system depends on trouble-free transformer operation. In electricity utilities around the world, a significant number of power transformers are now operating beyond their design life, and most of these transformers are operating without evidence of distress. The same situation is evident in Australia: in Powerlink Queensland (PLQ), 25% of the power transformers were more than 25 years old in 1991. Priority attention should therefore be directed to research into improved diagnostic techniques for determining the condition of the insulation in aged transformers.
The insulation system in a power transformer consists of cellulosic materials (paper, pressboard and transformerboard) and processed mineral oil. The cellulosic materials and oil insulation used in transformers degrade with time. The degradation depends on the thermal, oxidative, hydrolytic, electrical and mechanical conditions which the transformer has experienced during its lifetime.
The condition of the paper and pressboard insulation has been monitored by (a) bulk measurements (dissolved gas analysis (DGA), insulation resistance (IR), tan δ and furan analysis) and (b) measurements on samples removed from the transformer (degree of polymerization (DP), tensile strength). At the interface between the paper and oil in the transformer, interfacial polarization may occur, resulting in an increase in the loss tangent and dielectric loss. A DC method was developed for measuring the interfacial polarization spectrum for the determination of insulation condition in aged transformers.
This paper makes contributions to the determination of the insulation condition of transformers by bulk measurements and by measurements on samples removed from the transformer. It is based on a University of Queensland research project conducted with cooperation from PLQ and GEC-Alsthom.
Most of the currently used techniques have drawbacks. Dissolved gas analysis requires a data bank of experimental results from failed transformers in order to predict the fault type. When transformer oil is replaced or refurbished, analysis of furans in the refurbished oil may show no trace of degradation, even though the cellulose may have degraded significantly. DP estimation is based on a single-point viscosity measurement, and molecular weight studies by single-point viscosity measurement are of limited value when dealing with a complex polymer blend such as Kraft paper, particularly where the molecular weight distribution of the paper changes significantly as degradation proceeds. In these instances a newer technique, gel permeation chromatography (GPC), is likely to be more useful than the viscosity method, because it provides information about the change in both molecular weight and molecular weight distribution. Investigation of the GPC technique was included in this research to assess its effectiveness in determining the condition of insulation.
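The distribution information GPC makes accessible can be summarised by the standard molecular-weight averages. The sketch below, using a hypothetical slice distribution for a degraded paper sample, computes the number-average (Mn) and weight-average (Mw) molecular weights and the polydispersity index, which a single-point viscosity measurement cannot provide:

```python
def molecular_weight_averages(slices):
    """slices: list of (M_i, n_i) pairs -- the molecular weight of each
    GPC slice and the molar abundance detected in that slice.
    Returns (Mn, Mw, polydispersity index)."""
    n_total   = sum(n for _, n in slices)
    nm_total  = sum(n * m for m, n in slices)
    nm2_total = sum(n * m * m for m, n in slices)
    mn = nm_total / n_total      # number-average molecular weight
    mw = nm2_total / nm_total    # weight-average molecular weight
    return mn, mw, mw / mn

# Hypothetical distribution for a degraded Kraft paper sample
slices = [(20_000, 1.0), (50_000, 2.0), (100_000, 1.5), (200_000, 0.5)]
mn, mw, pdi = molecular_weight_averages(slices)
print(f"Mn = {mn:.0f}, Mw = {mw:.0f}, PDI = {pdi:.2f}")
```

As chain scission proceeds during ageing, the low-molecular-weight tail grows; tracking Mn, Mw and the PDI over time therefore reveals changes in the shape of the distribution that a single viscosity-derived average would mask.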
Conventional electrical properties (dissipation factor and breakdown strength) of cellulosic materials are not significantly affected by ageing, so very little recent research has been directed at electrical diagnostic techniques. In this research project, thorough investigations were also undertaken of the conventional electrical properties, along with the interfacial polarization parameters of the cellulosic insulation materials. The interfacial phenomena are strongly influenced by insulation degradation products such as polar functionalities and water. The condition of the dielectric, and its degradation due to ageing, can be monitored by studying the rate and process of polarization under a DC field. Furthermore, this is a non-destructive diagnostic test.
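A common way to quantify the polarization rate from such a DC test is to fit the measured current decay to the empirical Curie–von Schweidler form i(t) ≈ A·t⁻ⁿ, whose exponent n reflects the polarization processes in the dielectric. The sketch below does this by least squares in log–log space; the decay parameters and data are synthetic, for illustration only:

```python
import math

def fit_power_law(times, currents):
    """Fit i(t) = A * t**(-n) by linear least squares in log-log space.
    Returns (A, n)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(i) for i in currents]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return math.exp(intercept), -slope

# Synthetic polarization-current decay, i(t) = 2.0e-9 * t**-0.8 (amperes)
times = [1.0, 10.0, 100.0, 1000.0]              # seconds after DC step
currents = [2.0e-9 * t ** -0.8 for t in times]
A, n = fit_power_law(times, currents)
```

On a real transformer the current would be measured after applying a DC step, and shifts in the fitted exponent between new and aged insulation would indicate changes in the polarization behaviour; since only a modest DC stress is applied, the test leaves the insulation intact.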
A retired power transformer (25 MVA, 11/132 kV) and several distribution transformers were used for the experimental work. The results from these transformers will be presented, and an attempt will be made to correlate the electrical and chemical test results. The variation of the results across different locations in a power transformer will be discussed with reference to the thermal stress distribution. Accelerated ageing experiments were conducted to predict long-term insulation behaviour; their results are presented in the accompanying paper.

E-bomb

The next Pearl Harbor will not announce itself with a searing flash of nuclear light or with the plaintive wails of those dying of Ebola or its genetically engineered twin. You will hear a sharp crack in the distance. By the time you mistakenly identify this sound as an innocent clap of thunder, the civilized world will have become unhinged. Fluorescent lights and television sets will glow eerily bright, despite being turned off. The aroma of ozone mixed with smoldering plastic will seep from outlet covers as electric wires arc and telephone lines melt. Your Palm Pilot and MP3 player will feel warm to the touch, their batteries overloaded. Your computer, and every bit of data on it, will be toast. And then you will notice that the world sounds different too. The background music of civilization, the whirl of internal-combustion engines, will have stopped. Save a few diesels, engines will never start again. You, however, will remain unharmed, as you find yourself thrust backward 200 years, to a time when electricity meant a lightning bolt fracturing the night sky. This is not a hypothetical, son-of-Y2K scenario. It is a realistic assessment of the damage that could be inflicted by a new generation of weapons--E-bombs.
Anyone who's been through a prolonged power outage knows that it's an extremely trying experience. Within an hour of losing electricity, you develop a healthy appreciation of all the electrical devices you rely on in life. A couple of hours later, you start pacing around your house. After a few days without lights, electric heat or TV, your stress level shoots through the roof. But in the grand scheme of things, that's nothing. If an outage hits an entire city, and there aren't adequate emergency resources, people may die from exposure, companies may suffer huge productivity losses and millions of dollars of food may spoil. If a power outage hit on a much larger scale, it could shut down the electronic networks that keep governments and militaries running. We are utterly dependent on power, and when it's gone, things get very bad, very fast.
An electromagnetic bomb, or e-bomb, is a weapon designed to take advantage of this dependency. But instead of simply cutting off power in an area, an e-bomb would actually destroy most machines that use electricity. Generators would be useless, cars wouldn't run, and there would be no chance of making a phone call. In a matter of seconds, a big enough e-bomb could thrust an entire city back 200 years or cripple a military unit.