
ECE Seminar Series: Quantum Dot Channel FETs and Nonvolatile Memories: Fabrication and Modeling


ECE Seminar Series Fall 2017

Wednesday November 15th 2:30-3:30 PM, GENT 103

Quantum Dot Channel FETs and Nonvolatile Memories: Fabrication and Modeling

Dr. Jun Kondo

Abstract: This talk presents the modeling and fabrication of quantum dot channel field-effect transistors (QDC-FETs) using cladded Ge quantum dots on poly-Si thin films grown on silicon-on-insulator (SOI) substrates. HfAlO2 high-k dielectric layers are used for the gate dielectric. QDC-FETs exhibit multi-state I-V characteristics that enable two-bit processing, reducing FET count and power dissipation, and are expected to make a significant impact on digital circuit design. Quantum dot channel FETs can also be configured as floating-gate quantum dot nonvolatile memories (QDC-NVMs). In these NVMs, the floating gate comprises GeOx-Ge quantum dots. The QD nonvolatile memories (QD-NVMs) are fabricated on polysilicon thin films using SOI substrates. HfAlO2 high-k insulator layers serve as both the tunnel gate oxide and the control gate dielectric. QDC-NVMs provide not only significantly higher drain current (ID) flow but also significantly larger threshold voltage shifts, which improve threshold voltage variation and show the potential for fabricating multi-bit nonvolatile memories.
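As a toy illustration of why a four-state channel reduces transistor count (the current thresholds below are invented placeholders, not measured QDC-FET values), here is a minimal Python sketch of decoding one four-level drain current into two bits:

```python
# Illustrative only: decode one of four drain-current states into two bits.
# The thresholds are invented placeholders, not measured QDC-FET values.
THRESHOLDS_UA = [10.0, 20.0, 30.0]   # state boundaries, microamps

def decode_two_bits(drain_current_ua):
    """Map a sensed drain current to a 2-bit symbol (states 0-3)."""
    state = sum(drain_current_ua > t for t in THRESHOLDS_UA)
    return state >> 1, state & 1     # (high bit, low bit)

# One four-state device carries the information of two binary FETs,
# which is where the reduced transistor count and power come from.
print(decode_two_bits(24.5))         # -> (1, 0), i.e. state 2
```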

Portable Microscope Makes Field Diagnosis Possible

Siddharth Rawat, left, a Ph.D. student, and Bahram Javidi, Board of Trustees Distinguished Professor of Electrical and Computer Engineering, operate a prototype device to examine blood samples for diseases. The portable holographic field microscope offers medical professionals a fast and reliable tool for the identification of diseased cells. (Peter Morenus/UConn Photo)


A portable holographic field microscope developed by UConn optical engineers could provide medical professionals with a fast and reliable new tool for the identification of diseased cells and other biological specimens.

The device, featured in a recent paper published by Applied Optics, uses the latest in digital camera sensor technology, advanced optical engineering, computational algorithms, and statistical analysis to provide rapid automated identification of diseased cells.

One potential field application for the microscope is helping medical workers identify patients with malaria in remote areas of Africa and Asia where the disease is endemic.

Quick and accurate detection of malaria is critical when it comes to treating patients and preventing outbreaks of the mosquito-borne disease, which infected more than 200 million people worldwide in 2015, according to the Centers for Disease Control. Laboratory analysis of a blood sample remains the gold standard for confirming a malaria diagnosis. Yet access to trained technicians and necessary equipment can be difficult and unreliable in those regions.

The microscope’s potential applications go far beyond the field diagnosis of malaria. The detailed holograms generated by the instrument also can be used in hospitals and other clinical settings for rapid analysis of cell morphology and cell physiology associated with cancer, hepatitis, HIV, sickle cell disease, heart disease, and other illnesses, the developers say.

In checking for the presence of disease, most hospitals currently rely on dedicated laboratories that conduct various tests for cell analysis and identification. But that approach is time consuming, expensive, and labor intensive. It also has to be done by skilled technicians working with the right equipment.

“Our optical instrument cuts down the time it takes to process this information from days to minutes,” says Bahram Javidi, Board of Trustees Distinguished Professor in the Department of Electrical and Computer Engineering and the microscope’s senior developer. “And people running the tests don’t have to be experts, because the algorithms will determine if a result is positive or negative.”

The research team consulted with hematologists, and the algorithms used with the instrument are able to compare a sample against the known features of healthy cells and the known features of diseased cells in order to make proper identification. “It’s all done very quickly,” Javidi says.

How the Device Works

When it comes to identifying patients with malaria, here’s how the device works: A thin smear from a patient’s blood sample is placed on a glass slide, which is put under the microscope for analysis. The sample is exposed to a monochromatic light beam generated by a laser diode or other light source. Special components and optical technologies inside the microscope split the light beam into two beams in order to record a digital hologram of the red blood cells in the sample. An image sensor, such as a digital webcam or cell phone camera, connected to the 3-D microscope captures the hologram. From there, the captured data can be transferred to a laptop computer or offsite laboratory database via the internet. Loaded with dedicated algorithms, the computer or mobile device hardware reconstructs a 3-D profile of the cell and measures the interaction of light with the cell under inspection. Any diseased cells are identified using computer pattern recognition software and statistical analysis.
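The authors’ reconstruction code is not published in this article; the following is a minimal sketch of the standard off-axis digital-holography pipeline the description implies: isolate the +1 diffraction order in the Fourier domain, re-center it to strip the carrier fringes, and take the phase of the inverse transform. The sideband location and filter radius are assumed inputs that a real instrument would calibrate.

```python
import numpy as np

def reconstruct_phase(hologram, sideband_center, radius):
    """Quantitative phase map from an off-axis hologram (2-D intensity array).

    sideband_center: (row, col) of the +1 diffraction order in the shifted
    FFT; radius: bandpass radius in pixels. Both are assumptions here and
    would be calibrated on a real instrument.
    """
    rows, cols = hologram.shape
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    r, c = np.ogrid[:rows, :cols]
    mask = (r - sideband_center[0]) ** 2 + (c - sideband_center[1]) ** 2 <= radius ** 2
    sideband = np.where(mask, spectrum, 0)
    # Shift the sideband to the spectrum center to remove the carrier fringes.
    sideband = np.roll(sideband,
                       (rows // 2 - sideband_center[0], cols // 2 - sideband_center[1]),
                       axis=(0, 1))
    field = np.fft.ifft2(np.fft.ifftshift(sideband))
    # Sequential 1-D unwrapping; production systems use true 2-D unwrappers.
    return np.unwrap(np.unwrap(np.angle(field), axis=0), axis=1)
```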

Quantitative phase profiles of healthy red blood cells (top row) and malaria infected cells (bottom row). (Holographic microscope image courtesy of Bahram Javidi)

Red blood cells infected with the malaria-causing Plasmodium parasite exhibit different properties than healthy blood cells when light passes through them, Javidi says.

“Light behaves differently when it passes through a healthy cell compared to when it passes through a diseased cell,” Javidi says. “Today’s advanced sensors can detect those subtle differences, and it is those nanoscale variations that we are able to measure with this microscope.”

Conventional light microscopes only record the projected image intensity of an object, and have limited capability for visualizing the detailed quantitative characterizations of cells. The digital holograms acquired by UConn’s 3-D microscope, on the other hand, capture unique micro and nanoscale structural features of individual cells with great detail and clarity. Those enhanced images allow medical professionals and researchers to measure an individual cell’s thickness, volume, surface, and dry mass, as well as other structural and physiological changes in a cell or groups of cells over time – all of which can assist in disease identification, treatment, and research. For instance, the device could help researchers see whether new drugs impact cells positively or negatively during clinical trials.
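A minimal sketch of how such features might be computed from a quantitative phase map follows. The laser wavelength, the cell-versus-medium refractive index difference, and the specific refraction increment (roughly 0.19 ml/g, a value commonly used for protein in the quantitative-phase-imaging literature) are all assumptions, not values from the paper.

```python
import numpy as np

WAVELENGTH_UM = 0.633     # assumed laser line (HeNe-class), um
DELTA_N = 0.06            # assumed cell-vs-medium refractive index difference
ALPHA_UM3_PER_PG = 0.19   # specific refraction increment: 0.19 ml/g == 0.19 um^3/pg

def cell_features(phase, pixel_area_um2):
    """Morphology features from a quantitative phase map (radians)."""
    opd = phase * WAVELENGTH_UM / (2 * np.pi)   # optical path difference, um
    thickness = opd / DELTA_N                   # physical thickness map, um
    cell = phase > 0.5                          # crude threshold segmentation
    area_um2 = cell.sum() * pixel_area_um2
    volume_um3 = thickness[cell].sum() * pixel_area_um2
    dry_mass_pg = opd[cell].sum() * pixel_area_um2 / ALPHA_UM3_PER_PG
    return thickness, area_um2, volume_um3, dry_mass_pg
```

Feature vectors like these are what the statistical pattern recognition stage would then compare against the known profiles of healthy and diseased cells.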

The techniques associated with the holographic microscope also are non-invasive, highlighting its potential use for long-term quantitative analysis of living cells.

Conventional methods of testing blood samples for disease frequently involve labeling, which means the sample is treated with a chemical agent to assist with identification. In the case of malaria, red blood cells are usually treated with a Giemsa stain that reacts to proteins produced by malaria-carrying parasites and thus identifies them. But introducing a chemical into a live cell can change its behavior or damage it.

“If you’re doing an in vitro inspection of stem cells, for instance, and you introduce a chemical agent, you risk damaging those cells. And you can’t do that, because you may want to introduce those cells into the human body at some point,” Javidi says. “Our instrument doesn’t rely on labeling, and therefore avoids that problem.” 

Ph.D. students Tim O’Connor ’17 (ENG), left, Siddharth Rawat, and Adam Markman ’11 (ENG) operate a prototype device to examine blood samples for diseases at the Javidi lab in the Information Technologies Engineering Building. (Peter Morenus/UConn Photo)

The holographic microscope was developed in UConn’s new Multidimensional Optical Sensing & Imaging Systems or MOSIS lab, where Javidi serves as director. The MOSIS lab integrates optics, photonics, and computational algorithms and systems to advance the science and engineering of imaging from nano to macro scales.

A comprehensive report on the MOSIS lab’s work with 3-D optical imaging for medical diagnostics was published last year in Proceedings of the IEEE, the top-ranked journal for electrical and electronics engineering. Joining Javidi in this research are graduate students Adam Markman, Siddharth Rawat, Satoru Komatsu, and Tim O’Connor from UConn; and Arun Anand, an applied optics specialist with Maharaja Sayajirao University of Baroda in Vadodara, India.

The microscope research is supported by Nikon and the National Science Foundation (ECCS 1545687). Students are supported by the U.S. Department of Education, GE, and Canon fellowships. Other sponsors that have supported Javidi’s broader research work and the MOSIS lab over the years include the Defense Advanced Research Projects Agency or DARPA, the U.S. Air Force Research Laboratory, the U.S. Army, the Office of Naval Research, Samsung, Honeywell, and Lockheed Martin. He has collaborated with colleagues from numerous universities and industries around the world during his time at UConn, including research facilities in Japan, Korea, China, India, Germany, England, Italy, Switzerland, and Spain, among other countries.

Javidi is working with colleagues at UConn Health, including medical oncology and hematology specialist Dr. Biree Andemariam and her staff, for other medical applications. UConn’s tech commercialization office has been involved in discussing potential marketing opportunities for the portable digital microscope. A prototype of the microscope used for initial tests was assembled using 3-D printing technologies, lowering its production costs.


Original from UConn Today, Colin Poitras.

Navy Using New UConn Software to Improve Navigation

The Navy is using new software developed by UConn engineering professor Krishna Pattipati to vastly improve the ability to route ships through unpredictable situations.


Major research discoveries generate news headlines. But a research undertaking by one University of Connecticut engineering lab seeks to forestall some headlines of a different kind.

The loss of life because of weather events, as happened on Oct. 1, 2015, when the cargo ship El Faro sank with its 33-member crew in Hurricane Joaquin, is one example. Transcripts released by the National Transportation Safety Board showed an increasingly anxious and panicked crew as the 790-foot vessel sailed into the raging storm two years ago.

Software developed by Krishna Pattipati, UTC Professor in Systems Engineering at UConn, and his research team, in collaboration with the U.S. Naval Research Laboratory-Monterey, may go a long way toward avoiding such tragedies.

The prototype, named TMPLAR (Tool for Multi-objective Planning and Asset Routing), is now being used by the Navy to vastly improve the ability of ships to reroute through unpredictable weather. It is the type of technology transition that the new National Institute for Undersea Vehicle Technology, based at UConn Avery Point, is now able to foster.

Screenshot of a requested ship transit from San Diego, California, toward Alaska. The black line is the suggested route the Navy navigator is given to accept or reject and send on as directions to a ship’s captain. The numbered red circles are ‘waypoints’ along the route, with the starting point labeled ‘0’. These waypoints divide up a possibly long voyage and keep the ship’s path in check.

Created by Pattipati and electrical and computer engineering graduate students David Sidoti, Vinod Avvari, Adam Bienkowski, and Lingyi Zhang, and undergraduate students Matthew Macesker and Michelle Voong, TMPLAR is still in development, but it has already been fully integrated with the Navy’s meteorology and oceanographic weather forecasts.

Members of the UConn team meet weekly with Navy officials, via teleconference, to discuss project updates and receive feedback.

“Their progress is fast,” says Sidoti. “Frankly, it’s kept us on our toes as we try to manage both our academic responsibilities here at UConn while enhancing and updating the software.”

TMPLAR is like a much more complex version of Google Maps, because it will be applied to ships and submarines, where there is no underlying network of roadways to navigate.

In Google Maps, a user typically seeks to maximize the average speed of travel between start and end locations to get to a destination in the shortest amount of time, hence the route may favor highways instead of back roads.

Pattipati’s team is now approaching problems with upwards of 17 objectives, which may change depending on the vehicle and the conditions.

The algorithms take into account obstacles such as ocean depth, undersea pipelines, cables, and oil rigs. And they factor in multiple user objectives: whether to traverse an area to minimize travel time, maximize fuel efficiency given the predicted weather, accomplish training objectives, or maximize operational endurance.

“The tool guarantees safe travel from any point in the ocean, above, on, or below its surface, while making choices en route that optimize fuel consumption and cater to any set of objectives of the operator,” says Sidoti. “Using special clustering techniques, the tool’s algorithms have even been applied to finding low-risk routes that avoid storms or hurricanes.”
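TMPLAR’s algorithms are not detailed in this article; as a rough stand-in for the idea, here is a minimal Dijkstra search over an ocean grid in which each edge cost is a weighted sum of two objectives (travel time and fuel), and storm or obstacle cells are impassable. The grid encoding, weights, and cost values are invented for the sketch; the real tool handles many more objectives and a far richer environment model.

```python
import heapq

def plan_route(grid, start, goal, weights=(1.0, 1.0)):
    """Dijkstra over a 2-D ocean grid.

    grid[r][c] is None for an impassable cell (storm, land, oil rig),
    otherwise a (travel_time, fuel) tuple for entering that cell.
    weights scalarizes the objectives into one edge cost.
    """
    rows, cols = len(grid), len(grid[0])
    best = {start: 0.0}
    parent = {}
    frontier = [(0.0, start)]
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node == goal:                      # rebuild the path on arrival
            path = [node]
            while node != start:
                node = parent[node]
                path.append(node)
            return cost, path[::-1]
        if cost > best[node]:                 # stale queue entry
            continue
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] is not None:
                step = sum(w * o for w, o in zip(weights, grid[nr][nc]))
                ncost = cost + step
                if ncost < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ncost
                    parent[(nr, nc)] = node
                    heapq.heappush(frontier, (ncost, (nr, nc)))
    return float("inf"), []                   # no safe route found
```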

The next step for TMPLAR is programming the tool for use by aircraft, such as drones.

Last month, Pattipati and Sidoti traveled to San Diego to demonstrate the capabilities of the software to the Space and Naval Warfare Systems Center Pacific. Their algorithm is now going to be integrated with a tool for aircraft carrier strike group planning.

The lab first published details about the software last year in an IEEE journal; IEEE is the world’s largest professional organization for the advancement of technology. Avvari, one of the graduate students, will detail some of the enhancements made since then at an upcoming professional conference.

And, as the software transitions to operational settings, the team is looking to speed up its capability to output smart, weather-informed route recommendations in less than a second. Adding neural network modules to TMPLAR is another new horizon; artificial intelligence would help condense the solutions so they are less overwhelming to a user, says Sidoti.

When he reviewed the factors faced by the crew of El Faro using TMPLAR software, Sidoti was able to find safe routes for the ship that involved waiting at waypoints and varying the ship’s speed in order to avoid unsafe environmental conditions, while also reducing costs of the route.

The Coast Guard’s report on the tragedy – released just a month ago – said the captain misjudged the strength of Hurricane Joaquin and should have changed the El Faro’s course.

Sidoti found up to eight possible safe routes using TMPLAR. That’s the sort of information he hopes other captains will have.

Recently, the team received notification that the software was demonstrated to onboard ship navigators, who were interested enough to request the ability to use it to plan and test a real-world deployment.

Funding for this research is supported by the U.S. Office of Naval Research under contracts #N00014-16-1-2036 and #N00014-12-1-0238; by the Naval Research Laboratory under contract #N00173-16-1-G905; and by the Department of Defense High Performance Computing Modernization Program under subproject contract #HPCM034125HQU.


Original from UConn Today, Kristen Cole.

Award-winning Paper Questions ECG As Secure Biometric

A paper by UConn fourth-year Ph.D. student Nima Karimian won the best student paper award at the recent IJCB 2017 conference in Denver.

The Conference

The International Joint Conference on Biometrics (IJCB 2017) combines two major annual biometrics research conferences, the Biometrics Theory, Applications and Systems (BTAS) conference and the International Conference on Biometrics (ICB). The blending of these two conferences in 2017 is the result of a special agreement between the IEEE Biometrics Council and the IAPR TC-4, and presents an exciting event for the entire worldwide biometrics research community.

The Paper

The paper, “On the Vulnerability of ECG Verification to Online Presentation Attacks,” examined the use of the electrocardiogram (ECG) as a secure biometric modality. ECG has long been regarded as a biometric modality that is impractical to copy, clone, or spoof. However, it was recently shown that an ECG signal can be replayed from arbitrary waveform generators, computer sound cards, or off-the-shelf audio players. The award-winning paper is one of the first in the field to seriously question the security of ECG verification, and goes a long way toward debunking the assumption of its security.

The paper developed a novel presentation attack in which a short template of the victim’s ECG is captured by an attacker and used to map the attacker’s ECG onto the victim’s, which can then be presented to the sensor using one of the sources above. The authors’ approach involves exploiting ECG models, characterizing the differences between ECG signals, and developing mapping functions that transform any ECG into one that closely matches an authentic user’s ECG. Their proposed approach, which can operate online or on the fly, is compared with a more ideal offline scenario in which the attacker has more time and resources. In the experiments, the offline approach achieved average success rates of 97.43% and 94.17% for non-fiducial and fiducial ECG authentication, respectively. In the online scenario, performance degrades by 5.65% for non-fiducial authentication, but is nearly unaffected for fiducial authentication.
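The paper’s model-based mapping functions are more sophisticated than this, but a least-squares sketch conveys the gist: given a handful of time-aligned heartbeats from the attacker and the victim (a strong simplifying assumption), fit a linear map that warps the attacker’s live beat toward the victim’s, ready for replay through a waveform generator or sound card.

```python
import numpy as np

def fit_linear_map(attacker_beats, victim_beats):
    """Least-squares map M such that attacker_beat @ M ~ victim_beat.

    Both inputs: arrays of shape (n_beats, n_samples), already segmented
    and time-aligned -- a toy stand-in for the paper's ECG models.
    With few beats the problem is underdetermined; lstsq returns the
    minimum-norm solution, which is fine for a sketch.
    """
    M, *_ = np.linalg.lstsq(attacker_beats, victim_beats, rcond=None)
    return M

def spoof(attacker_beat, M):
    """Transform one live attacker beat into a victim-like beat."""
    return attacker_beat @ M
```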

The work was supported by the U.S. Army Research Office (ARO) under award number W911NF16-1-0321.

ECE Seminar Series: Energy Harvesting (EH) Opportunities in 5G


ECE Seminar Series Fall 2017

Monday October 16th 11:00 AM-12:00 PM, ITE 401

Energy Harvesting (EH) Opportunities in 5G

Brian Zahnstecher

PowerRox

Abstract: We have all seen plenty of marketing hype for what will eventually become 5G in the ~2020 production deployment timeframe. At PowerRox, we have spent the last year trying to shed light on the most critical aspects of the network (from architecture to utilization): the paradigm shifts in power electronics and power utilization required to enable 5G. Now, we can investigate one of the most interesting, yet highly underappreciated, opportunities in 5G power: energy harvesting (EH). Everyone likes the thought of free, ambient energy, but most think this technology neither produces a usable amount of power nor has a production ecosystem mature enough for a telecom deployment. This talk will not only help dispel these perceptions, but also open attendees’ eyes to use cases they might not otherwise have thought possible and/or applicable to telco settings (with a special focus on 5G).

Energy harvesting (EH) presents a host of interesting and useful applications that can be exploited today, as well as a roadmap for expanding those use cases going forward. From mW to MW, there are scalable EH technologies for nearly every energy source physics affords us (e.g., kinetic, thermal, RF, photovoltaic, piezoelectric, vibrational). The major shift from macro towers to heterogeneous networks (HetNets) of many small cells makes 5G an ideal candidate for EH applications. Battery mitigation is a key goal of EH technology: initially by supplementing battery power to extend battery life, and eventually by doing away with batteries altogether. Even security, at network points from the base station to the grid level, can benefit from EH, by achieving grid independence and/or inhibiting undesired network penetration.
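As a back-of-the-envelope illustration of battery mitigation (every number below is an invented placeholder, not data from the talk), one can ask what transmit duty cycle a harvester could sustain for a duty-cycled radio node:

```python
# Toy energy budget for a duty-cycled radio node fed by a harvester.
# All figures are invented placeholders for illustration.
harvest_uw = 250.0   # average harvested power (uW), e.g. small indoor PV
active_mw = 60.0     # radio draw while transmitting (mW)
sleep_uw = 3.0       # sleep-mode draw (uW)

# Sustainability condition: harvest >= duty*active + (1 - duty)*sleep
duty = (harvest_uw - sleep_uw) / (active_mw * 1000.0 - sleep_uw)
print(f"sustainable duty cycle: {duty:.4%}")   # ~0.41% of the time on-air
```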

This talk will provide attendees with a wealth of knowledge and thought-provoking insight into how EH can be applied to 5G and creative applications beyond. First, we will give a quick overview of EH sources/technologies and review the transducers, power management ICs (PMICs)/topologies, energy storage, and test/measurement solutions that make up the production ecosystem. Then, we will take a deep dive into the implementation of these constituents in practical power electronics solutions and see how even μW can be scaled to more usable power levels. Finally, we will close with a number of quick case studies on applying EH to 5G (and related) applications at all levels of the network, from data center to edge. Additionally, we will look at some more unique applications within 5G (e.g., security) for which EH is a key enabler.


Short Bio: Brian Zahnstecher is a Sr. Member of the IEEE, Chair of the IEEE SF Bay Area Power Electronics Society (PELS), and the Principal of PowerRox, where he focuses on power design, integration, system applications, OEM market penetration, and private seminars for power electronics. He has successfully handled assignments in system design/architecting, AC/DC front-end power, EMC/EMI design/debug, embedded solutions, processor power, and digital power solutions for a variety of clients. He previously held positions in power electronics with industry leaders Emerson Network Power, Cisco, and Hewlett-Packard, where he advised on best practices, oversaw product development, managed international teams, created/enhanced optimal workflows and test procedures, and designed and optimized voltage regulators. He has been a regular contributor to the industry as an invited speaker, author, workshop participant, session host, roundtable moderator, and volunteer. He has over 13 years of industry experience and holds Master of Engineering and Bachelor of Science degrees from Worcester Polytechnic Institute.

PowerRox is a firm dedicated to solving power problems for those seeking to establish or enhance their position in the enterprise and consumer power electronics marketplace. We specialize in improving efficiency, increasing reliability, and achieving cost reduction through hands-on support and training/seminars/workshops. We can solve problems in power supply design, power system development, system debug and test, cost/performance analysis, marketing, and re-design. We are committed to meeting all deadlines, performing on budget, debugging/testing solutions to required levels, and doing the highest-quality work possible.

ECE Seminar Series: Resiliency and Security of the Future Power Grid


ECE Seminar Series Fall 2017

(co-sponsored with the Eversource Energy Center)

Monday October 16th 1:00-2:00 PM, LH 201

Resiliency and Security of the Future Power Grid

Chen-Ching Liu

Boeing Distinguished Professor
Director, Energy Systems and Innovation Center (ESIC)
School of Electrical Engineering & Computer Science
Washington State University

Abstract: The development of the smart grid in the U.S. over the last decade has significantly enhanced data acquisition capabilities on the transmission system. On the distribution network, numerous remote control devices and voltage/var control systems have been installed, and millions of smart meters are now operational on the customer side. Although the level of automation has improved, there remain great challenges in the grid’s ability to withstand extreme events such as catastrophic hurricanes and earthquakes. Resiliency of the future grid can be achieved by enabling flexible reconfiguration with distributed resources, e.g., microgrids, distributed generation, and renewable and storage devices. Advanced and distributed operation and control will be critical to this vision. The fast-increasing connectivity of devices and systems on the power grid has also led to serious concern over the security of this complex cyber-physical system. Progress has been made in developing new technologies for cyber security of the power grid, including monitoring, vulnerability assessment, intrusion detection, and mitigation.


Short Bio: Chen-Ching Liu is Boeing Distinguished Professor at Washington State University (WSU), Pullman, WA. At WSU, Professor Liu served as Director of the Energy Systems Innovation Center. From 1983 to 2005, he was a Professor of Electrical Engineering at the University of Washington, Seattle. Dr. Liu was Palmer Chair Professor at Iowa State University from 2006 to 2008. From 2008 to 2011, he served as Acting/Deputy Principal of the College of Engineering, Mathematical and Physical Sciences at University College Dublin, Ireland. Professor Liu received an IEEE Third Millennium Medal in 2000 and the Power and Energy Society Outstanding Power Engineering Educator Award in 2004. In 2013, Dr. Liu received a Doctor Honoris Causa from the Polytechnic University of Bucharest, Romania. Chen-Ching chaired the IEEE Power and Energy Society Fellow Committee, the Technical Committee on Power System Analysis, Computing and Economics, and the Outstanding Power Engineering Educator Award Committee. He served on the U.S. National Academies Board on Global Science and Technology. Professor Liu is a Fellow of the IEEE and a Member of the Washington State Academy of Sciences.

ECE Seminar Series: A New Look at Optimal Control of Wireless Networks


ECE Seminar Series Fall 2017

Friday October 13th 2:30-3:30 PM, ITE 119

A New Look at Optimal Control of Wireless Networks

Eytan Modiano

Laboratory for Information and Decision Systems
Massachusetts Institute of Technology

Abstract: We address the problem of throughput-optimal packet dissemination in wireless networks with an arbitrary mix of unicast, broadcast, multicast and anycast traffic. We start with a review of the seminal work of Tassiulas and Ephremides on optimal scheduling and routing of unicast traffic, i.e., the famous backpressure algorithm. The backpressure algorithm maximizes network throughput, but suffers from high implementation complexity, and poor delay performance due to packets looping inside the network. Moreover, backpressure routing is limited to unicast traffic, and cannot be used for broadcast or multicast traffic. We will describe a new online dynamic policy, called Universal Max-Weight (UMW), which solves the above network flow problems simultaneously and efficiently. To the best of our knowledge, UMW is the first throughput-optimal algorithm for solving the generalized network-flow problem. When specialized to the unicast setting, the UMW policy yields a throughput-optimal, loop-free, routing and link-scheduling policy. Extensive simulation results show that the proposed UMW policy incurs substantially smaller delays as compared to backpressure.
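A minimal single-commodity sketch of the max-weight/backpressure idea the abstract reviews: weight each link by its queue differential and activate the interference-free link set of maximum total weight. The topology, queue values, and activation sets below are invented, and UMW itself differs in detail (it operates on virtual queues over a relaxed network).

```python
def backpressure_weights(queues, links):
    """Queue-differential weight for each directed link (u, v)."""
    return {(u, v): max(queues[u] - queues[v], 0) for (u, v) in links}

def schedule(queues, activation_sets):
    """Activate the interference-free link set with maximum total weight."""
    links = {link for s in activation_sets for link in s}
    w = backpressure_weights(queues, links)
    return max(activation_sets, key=lambda s: sum(w[link] for link in s))

# Example: 4-node line network where only one link may transmit at a time.
queues = {"a": 9, "b": 4, "c": 5, "d": 1}
sets = [{("a", "b")}, {("b", "c")}, {("c", "d")}]
print(schedule(queues, sets))  # {('a', 'b')}: differential 5 beats 0 and 4
```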


Short Bio: Eytan Modiano received his B.S. degree in Electrical Engineering and Computer Science from the University of Connecticut at Storrs in 1986 and his M.S. and PhD degrees, both in Electrical Engineering, from the University of Maryland, College Park, MD, in 1989 and 1992 respectively. He was a Naval Research Laboratory Fellow between 1987 and 1992 and a National Research Council Post Doctoral Fellow during 1992-1993. Between 1993 and 1999 he was with MIT Lincoln Laboratory. Since 1999 he has been on the faculty at MIT, where he is a Professor and Associate Department Head in the Department of Aeronautics and Astronautics, and Associate Director of the Laboratory for Information and Decision Systems (LIDS). His research is on communication networks and protocols with emphasis on satellite, wireless, and optical networks. He is the co-recipient of the MobiHoc 2016 best paper award, the Wiopt 2013 best paper award, and the Sigmetrics 2006 best paper award. He is the Editor-in-Chief for IEEE/ACM Transactions on Networking, and served as Associate Editor for IEEE Transactions on Information Theory and IEEE/ACM Transactions on Networking. He was the Technical Program co-chair for IEEE Wiopt 2006, IEEE Infocom 2007, ACM MobiHoc 2007, and DRCN 2015. He is a Fellow of the IEEE and an Associate Fellow of the AIAA, and served on the IEEE Fellows committee.

Teaching Robots to Think

Original Author: Colin Poitras – UConn Communications – September 13, 2017

Ashwin Dani, assistant professor of electrical and computer engineering, demonstrates how the robot can be given a simple task which can be repeated. Sept. 7, 2017. (Sean Flynn/UConn Photo)

In a research building in the heart of UConn’s Storrs campus, assistant professor Ashwin Dani is teaching a life-size industrial robot how to think.

Here, on a recent day inside the University’s Robotics and Controls Lab, Dani and a small team of graduate students are showing the humanoid bot how to assemble a simple desk drawer.

The “eyes” on the robot’s face screen look on as two students build the wooden drawer, reaching for different tools on a tabletop as they work together to complete the task.

The robot may not appear intently engaged. But it isn’t missing a thing – or at least that’s what the scientists hope. For inside the robot’s circuitry, its processors are capturing and cataloging all of the humans’ movements through an advanced camera lens and motion sensors embedded in its metallic frame.

Ashwin Dani, assistant professor of electrical and computer engineering, is developing algorithms and software for robotic manipulation, to improve robots’ interaction with humans. (Sean Flynn/UConn Photo)

Ultimately, the UConn scientists hope to develop software that will teach industrial robots how to use their sensory inputs to quickly “learn” the various steps for a manufacturing task – such as assembling a drawer or a circuit board – simply by watching their human counterparts do it first.

“We’re trying to move toward human intelligence,” says Dani, the lab’s director and a faculty member in the School of Engineering. “We’re still far from what we want to achieve, but we’re definitely making robots smarter.”

To further enhance robotic intelligence, the UConn team is also working on a series of complex algorithms that will serve as an artificial neural network for the machines, helping robots apply what they see and learn so they can one day assist humans at their jobs, such as assembling pieces of furniture or installing parts on a factory floor. If the process works as intended, these bots, in time, will know an assembly sequence so well, they will be able to anticipate their human partner’s needs and pick up the right tools without being asked – even if the tools are not in the same location as they were when the robots were trained.

This kind of futuristic human-robot interaction – called collaborative robotics – is transforming manufacturing. Industrial robots like the one in Dani’s lab already exist, although currently engineers must write intricate computer code for all of the robot’s individual movements, or manually adjust the robot’s limbs at each step in a process, to program it to perform. Teaching industrial robots to learn manufacturing techniques simply by observing could reduce to minutes a process that can currently take engineers days.

From left back row, Ph.D. students Iman Salehi, Harish Ravichandar, Kyle Hunte, Gang Yao, and seated, Ashwin Dani, assistant professor of electrical and computer engineering. (Sean Flynn/UConn Photo)

“Here at UConn, we’re developing algorithms that are designed to make robot programming easier and more adaptable,” says Dani. “We are essentially building software that allows a robot to watch these different steps and, through the algorithms we’ve developed, predict what will happen next. If the robot sees the first two or three steps, it can tell us what the next 10 steps are. At that point, it’s basically thinking on its own.”
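The lab’s actual predictors are built on neural networks, as the article describes; as a toy stand-in for “watch the first steps, predict the rest,” here is a first-order Markov model trained on demonstrated step sequences. The step names are invented for illustration.

```python
from collections import Counter, defaultdict

def train(sequences):
    """Count step-to-step transitions in demonstrated assemblies."""
    model = defaultdict(Counter)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            model[cur][nxt] += 1
    return model

def predict(model, observed, horizon=10):
    """Greedily roll out the most likely continuation of a partial sequence."""
    out, cur = [], observed[-1]
    for _ in range(horizon):
        if not model[cur]:
            break
        cur = model[cur].most_common(1)[0][0]
        out.append(cur)
    return out

demos = [["pick_panel", "align", "screw", "insert_rail", "test"],
         ["pick_panel", "align", "screw", "insert_rail", "test"]]
m = train(demos)
print(predict(m, ["pick_panel", "align"]))  # ['screw', 'insert_rail', 'test']
```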

In recognition of this transformative research, UConn’s Robotics and Controls Lab was recently chosen as one of 40 academic or academic-affiliated research labs supporting the U.S. government’s newly created Advanced Robotics for Manufacturing Institute or ARM. One of the collaborative’s primary goals is to advance robotics and artificial intelligence to maintain American manufacturing competitiveness in the global economy.

“There is a huge need for collaborative robotics in industry,” says Dani. “With advances in artificial intelligence, lots of major companies like United Technologies, Boeing, BMW, and many small and mid-size manufacturers, are moving in this direction.”

The United Technologies Research Center, UTC Aerospace Systems, and ABB US Corporate Research – a leading international supplier of industrial robots and robot software – are also representing Connecticut as part of the new ARM Institute. The institute is led by American Robotics Inc., a nonprofit associated with Carnegie Mellon University.

Connecticut’s and UConn’s contribution to the initiative will be targeted toward advancing robotics in the aerospace and shipbuilding industries, where intelligent, adaptable robots are more in demand because of the industries’ specialized needs.

Joining Dani on the ARM project are UConn Board of Trustees Distinguished Professor Krishna Pattipati, the University’s UTC Professor in Systems Engineering and an expert in smart manufacturing; and assistant professor Liang Zhang, an expert in production systems engineering.

“Robotics, with wide-ranging applications in manufacturing and defense, is a relatively new thrust area for the Department of Electrical and Computer Engineering,” says Rajeev Bansal, professor and head of UConn’s electrical and computer engineering department. “Interestingly, our first two faculty hires in the field received their doctorates in mechanical engineering, reflecting the interdisciplinary nature of robotics. With the establishment of the new national Advanced Robotics Manufacturing Institute, both UConn and the ECE department are poised to play a leadership role in this exciting field.”

The aerospace, automotive, and electronics industries are expected to represent 75 percent of all robots used in the country by 2025. One of the goals of the ARM initiative is to increase small manufacturers’ use of robots by 500 percent.

Industrial robots have come a long way since they were first introduced, says Dani, who has worked with some of the country’s leading researchers in learning, adaptive control, and robotics at the University of Florida (Warren Dixon) and the University of Illinois at Urbana-Champaign (Seth Hutchinson and Soon-Jo Chung). Many of the first factory robots were blind, rudimentary machines that were kept in cages and considered a potential danger to workers as their powerful hydraulic arms whipped back and forth on the assembly line.

Today’s advanced industrial robots are designed to be human-friendly. High-end cameras and elaborate motion sensors allow these robots to “see” and “sense” movement in their environment. Some manufacturers, like Boeing and BMW, already have robots and humans working side-by-side.

Of course, one of the biggest concerns within collaborative robotics is safety.

In response to those concerns, Dani’s team is developing algorithms that will allow industrial robots to quickly process what they see and adjust their movements accordingly when unexpected obstacles – like a human hand – get in their way.

“Traditional robots were very heavy, moved very fast, and were very dangerous,” says Dani. “They were made to do a very specific task, like pick up an object and move it from here to there. But with recent advances in artificial intelligence, machine learning, and improvements in cameras and sensors, working in close proximity with robots is becoming more and more possible.”

Dani acknowledges the obstacles in his field are formidable. Even with advanced optics, smart industrial robots need to be taught how to distinguish a metal rod from a flexible piece of wiring, and to understand the different physics inherent in each.

Movements that humans take for granted are huge engineering challenges in Dani’s lab. For instance: Inserting a metal rod into a pre-drilled hole is relatively easy. Knowing how to pick up a flexible cable and plug it into a receptacle is another challenge altogether. If the robot grabs the cable too far away from the plug, it will likely flex and bend. Even if the robot grabs the cable properly, it must not only bring the plug to the receptacle but also make sure the plug is oriented properly so it matches the receptacle precisely.

“Perception is always a challenging problem in robotics,” says Dani. “In artificial intelligence, we are essentially teaching the robot to process the different physical phenomena it observes, make sense out of what it sees, and then make the appropriate response.”

Research in UConn’s Robotics and Controls Lab is supported by funding from the U.S. Department of Defense and the UTC Institute of Advanced Systems Engineering. More detailed information about this research being conducted at UConn, including peer-reviewed article citations documenting the research, can be found here. Dani and graduate student Harish Ravichandar also have two patents pending on aspects of this research: “Early Prediction of an Intention of a User’s Actions,” Serial #15/659,827, and “Skill Transfer From a Person to a Robot,” Serial #15/659,881.

ECE Seminar Series: Building-to-Grid Control Framework for Grid Services


ECE Seminar Series Fall 2017

Wednesday September 27th 2:30-3:30 PM, KNS 103

Building-to-Grid Control Framework for Grid Services

Sumit Paudyal

Michigan Technological University

Abstract: With the implementation of smart grid technologies such as sensors, smart meters, and smart appliances, more than one-fourth of total U.S. electricity demand could be dispatchable. Coordinated demand dispatch of customers’ loads benefits both the customers and the grid. A complete demand dispatch solution that realizes these benefits involves a large-scale optimization problem with underlying complex transmission and distribution grid models. A centralized approach to this problem is computationally demanding for a practical-sized grid once comprehensive customer load models and grid models with discrete control variables are considered. A practical way to solve the problem is to use hierarchical and distributed computing approaches, in which information is exchanged between the different levels of the hierarchy. This talk presents a hierarchical framework to (i) optimally dispatch electric vehicle (EV) loads and (ii) optimally dispatch commercial building loads in building-to-grid (B2G) interactions. The case studies demonstrate the benefits of optimal demand dispatch of EV and building loads for customers and grid operations.
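As a stripped-down illustration of the lowest level of such a hierarchy (the prices, energy requirement, and power limit below are all invented), scheduling a single EV’s charging against a time-varying price is a small linear program:

```python
import numpy as np
from scipy.optimize import linprog

price = np.array([0.30, 0.28, 0.12, 0.10, 0.11, 0.25])  # $/kWh per hour, invented
need_kwh = 12.0   # energy the EV must receive over the horizon
max_kw = 6.6      # charger power limit; 1-hour slots, so kW == kWh per slot

# Minimize sum(price[t] * p[t])  s.t.  sum(p) == need,  0 <= p[t] <= max_kw.
res = linprog(c=price,
              A_eq=np.ones((1, len(price))), b_eq=[need_kwh],
              bounds=[(0, max_kw)] * len(price))
print(res.x)  # charging packed into the cheapest hours (here hours 3 and 4)
```

A grid-level coordinator in the hierarchy would then adjust the prices or limits each node sees so that the aggregate dispatch respects transmission and distribution constraints.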


Short Bio: Sumit Paudyal received a B.E. in Electrical Engineering from Tribhuvan University, Nepal, in 2003; an M.Sc. in Electrical Engineering from the University of Saskatchewan, Saskatoon, Canada, in 2008; and a Ph.D. in Electrical Engineering from the University of Waterloo, Ontario, Canada, in 2012. Dr. Paudyal is currently an Assistant Professor at Michigan Technological University. His research expertise includes smart distribution grid operations, optimization techniques in power systems, power system protection, and power system real-time hardware simulation.

Embedded System Competition Award


A UConn student team took first place in a MITRE-sponsored embedded systems security capture-the-flag competition this semester. The team was led by undergraduate ECE students Brian Marquis and Patrick Dunham, with graduate student Chenglu Jin and two CSE undergraduate students.