Mention "robots" and "morality" together and many people think of the fiction of Isaac Asimov. His "Three Laws of Robotics" (later expanded to four laws) are a great plot device, serving as a hierarchical ethical code for the robots in his stories: first, never harm a human being through action or inaction; second, obey human orders; last, protect oneself. From the first story in which the laws appeared, Asimov explored their inherent contradictions. It's great fiction, but unworkable theory. Meanwhile, from battlefields to open highways, machines are increasingly operating in the same physical spaces as humans and with more and more independence from human oversight. As Rosalind Picard of MIT has put it, "The greater the freedom of a machine, the more it will need moral standards." Science fiction aside, the prospect of machines capable of following moral principles, let alone understanding them, seems very remote. Nevertheless, philosophers, computer scientists, and engineers have begun reflecting seriously on the prospects for developing computer systems and robots capable of making moral decisions and a new field of inquiry directed at the development of artificial moral agents has begun to emerge. Where is it all heading?
The lecture will first summarize and highlight my earlier scientific research achievements in chronological order, paving the road to my recent research activity. These include theoretical work on relativistic electron scattering from standing electromagnetic waves, computer modeling of lasers and laser resonators, building state-of-the-art tabletop optical and spectroscopy systems, and applied research on ultrafast laser spectroscopy of photoactive proteins. The second part of the lecture will introduce our recent progress on the modeling, construction, and characterization of picosecond distributed feedback dye lasers. The last section will highlight a novel data analysis method applied to coherent Raman spectroscopy, along with my new results and future research directions in this field.
Despite the significant attention being given to the critical problems of cyber security, the ability to keep up with the increasing volume and sophistication of network attacks is seriously lagging. Throwing more computing horsepower at fundamentally limited visualization and analytic approaches will not get us anywhere. Instead, we need to seriously rethink the way cyber security tools and approaches are conceived, developed, and deployed. IHMC is taking advantage of the combined strengths of humans and software agents to create new capabilities for Network Operations Centers (NOCs). These capabilities are being implemented in a new cyber defense framework called Sol. Our objective is to enable distributed sensemaking, rapid detection of threats, and effective protection of critical resources. Specifically, we use agents, policies, and visualization to enact coactive emergence as a sensemaking strategy for taskwork and teamwork, and we implement capabilities for organic resilience and semantically rich policy governance as a means of assuring an effective and adaptive human-agent team response. IHMC has applied its many years of experience with software agents to the design of a new agent framework called Luna. Luna agents function both as interactive assistants to analysts and as continuously running background aids to data processing and knowledge discovery. Luna agents achieve much of their power through built-in teamwork capabilities that, in conjunction with IHMC's KAoS policy services framework, allow them to be proactive, collaborative, observable, and directable. To support dynamic scalability and other features of the Sol framework, the Luna platform supports the policy-governed option of allowing the state of agents (rather than their code) to migrate between operating environments and hosts, as sketched below.
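To make the state-migration idea concrete, here is a minimal, hypothetical sketch of a policy-governed migration check. The Luna and KAoS APIs are not described in this abstract, so every class and method name below is invented for illustration only.

```python
from typing import Optional
import pickle

class PolicyService:
    """Stand-in for a KAoS-style policy decision point (hypothetical)."""
    def authorizes(self, agent_id: str, action: str, target_host: str) -> bool:
        # A real policy service would evaluate semantically rich policies;
        # this stub allows migration only to an approved set of hosts.
        return action == "migrate" and target_host in {"noc-host-1", "noc-host-2"}

class Agent:
    def __init__(self, agent_id: str):
        self.agent_id = agent_id
        self.state = {"alerts_triaged": 0}  # working memory to migrate

    def snapshot_state(self) -> bytes:
        # Only the agent's *state* moves between hosts; its code is
        # assumed to be installed already in each operating environment.
        return pickle.dumps(self.state)

def migrate(agent: Agent, target_host: str, policy: PolicyService) -> Optional[bytes]:
    """Return a serialized state snapshot if policy permits, else None."""
    if not policy.authorizes(agent.agent_id, "migrate", target_host):
        return None  # migration denied by policy
    return agent.snapshot_state()  # ship this payload to target_host

if __name__ == "__main__":
    payload = migrate(Agent("luna-17"), "noc-host-2", PolicyService())
    print("migration allowed" if payload else "migration denied")
```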
As one moves to studying quantum systems and how they behave, it quickly becomes apparent that a completely different way of thinking about and visualizing the problems to be solved is required. Uniquely quantum effects are counterintuitive, and often difficult if not impossible to translate directly into a classical algorithm. A classical computer must be explicitly told how to implement even the simplest of quantum effects seen in nature. Quantum computers, on the other hand, have the ability to work with these unusual effects intrinsically, and can be guided to process enormous volumes of information in very novel ways. There is a tradeoff, though: efficient quantum algorithms are notoriously hard to design. They require the designer to speak and think in the quantum language of nature. This, however, is what it means to be a quantum programmer: learning how to guide nature using a language it understands, and in return gaining access to the incredible set of tools that nature has at its disposal.
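As a concrete illustration of the point about explicit classical bookkeeping, this minimal sketch (not from the talk) simulates a single-qubit superposition by tracking complex amplitudes by hand:

```python
import numpy as np

# One qubit starts in |0>; a classical simulation must store the
# full vector of complex amplitudes explicitly.
state = np.array([1.0, 0.0], dtype=complex)

# Apply a Hadamard gate to create the superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Born rule: measurement probabilities are |amplitude|^2.
print(np.abs(state) ** 2)  # -> [0.5 0.5]

# The bookkeeping scales exponentially: an n-qubit register needs
# 2**n complex amplitudes, which is what quantum hardware sidesteps.
```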
Shock Boundary Layer Interaction (SBLI): The interaction of a shock wave with a boundary layer on the surface of a body has been recognized as an important aerodynamic phenomenon for decades. Most studies have focused on two-dimensional flows involving a compression corner or SBLI on a flat plate. Research involving SBLI on a body of revolution, such as an ogive cylinder, has not been thoroughly documented. The extent of the flow separation is driven by a number of factors, including the strength of the incident shock wave (which produces significant pressure gradients within the boundary layer), the turbulent nature of the boundary layer just upstream of the interaction site, and the azimuthal angle of the interaction. If these interactions are not well understood and accounted for, they can have catastrophic consequences for the flow field. Boundary layer behavior depends mainly on the Reynolds number, while shock wave behavior depends primarily on the Mach number, so neither can be neglected. Additionally, the pressure gradient imposed by the shock wave drives the thickening or separation of the boundary layer. Thickening of the boundary layer induces a shock wave that interacts with the incident shock wave, providing a feedback mechanism that can induce unsteadiness in the flow field. Because of these complexities, SBLI even on simple geometries remains a subject of much research.
Reduced Order Modeling with Optimized Training Maneuvers: The ability to determine stability and control characteristics rapidly is an important element in the design of new flight vehicles (aircraft or weapons) and is similarly important for determining flight clearances of fielded aircraft. The objectives of this research are to investigate optimal reduced-order modeling techniques, such as proper orthogonal decomposition and surrogate-based modeling, for the rapid prediction of aerodynamic forces, moments, and their derivatives, as well as surface loads. The research effort will develop a reduced-order model capable of unsteady aerodynamic modeling. The approach will be suitable for analyzing aerodynamic characteristics, such as chemically reacting flows or massively separated flows, that are subject to prescribed maneuvers or varying freestream conditions. The work will focus on a hybrid a priori and a posteriori analysis for increasing accuracy and filling the required sample space. Furthermore, identification of the optimal training maneuver, and automation of its design, will be fully investigated.
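For readers unfamiliar with proper orthogonal decomposition, the snapshot form reduces to a singular value decomposition of mean-subtracted data. The sketch below uses random numbers in place of real aerodynamic snapshots; the matrix shapes and the 99% energy threshold are illustrative assumptions, not details of this effort.

```python
import numpy as np

# Hypothetical snapshot matrix: each column is one flow-field snapshot
# (e.g., surface pressures) sampled during a training maneuver.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 50))  # 1000 spatial points, 50 snapshots

# Subtract the mean flow so the modes capture fluctuations only.
X_fluct = X - X.mean(axis=1, keepdims=True)

# Thin SVD: columns of U are the POD modes; s**2 ranks their energy.
U, s, Vt = np.linalg.svd(X_fluct, full_matrices=False)

# Keep the r leading modes that capture 99% of the fluctuation energy.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1

# The reduced-order model evolves only the r modal coefficients;
# here we simply project and reconstruct to check the rank-r fit.
coeffs = U[:, :r].T @ X_fluct
X_rom = U[:, :r] @ coeffs
print(f"{r} modes, reconstruction error "
      f"{np.linalg.norm(X_fluct - X_rom) / np.linalg.norm(X_fluct):.3f}")
```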
Real-Time Image Processing on an FPGA Platform: The purpose of this effort is to design and demonstrate a proof-of-concept, real-time, FPGA-based image processing system for an Unmanned Aerial Vehicle. The effort entails developing and integrating a sensor interface to interact with the image processing system. Currently, this work is done on traditional PC-based systems, which are slow and consume power at rates above desirable targets. By utilizing reconfigurable computing, it is expected that the required design footprint will be minimal, with a drastic increase in performance and a substantial reduction in power consumption. The effort will use commercial off-the-shelf (COTS) technology.
Leading Edge Vortex Development on a Pitching Flat Plate at Low Reynolds Number: Many natural flyers utilize unsteady flow phenomena to enhance their flight characteristics in low Reynolds number regimes. Understanding the development and evolution of these flow structures is essential to understanding their influence on unsteady aerodynamics. This research investigates the leading edge vortex (LEV) development over a flat plate with an aspect ratio of 4 that is driven through a pitch-up maneuver. Two-component and three-component Particle Image Velocimetry (PIV) are utilized to measure the LEV and tip vortex (TiV), respectively. Line Integral Convolution (LIC) and vortex core identification techniques are applied to the measured flow fields to assess these unsteady structures throughout the kinematic motion.
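The abstract does not specify which vortex core identification technique is used; as a representative example, the sketch below computes the two-dimensional Q-criterion on a synthetic velocity field standing in for planar PIV data (Q > 0 marks rotation-dominated regions such as the LEV core).

```python
import numpy as np

# Hypothetical planar (2C) PIV velocity field on a uniform grid;
# a Gaussian vortex stands in for the measured LEV flow field.
x = np.linspace(0.0, 1.0, 64)
y = np.linspace(0.0, 1.0, 64)
X, Y = np.meshgrid(x, y, indexing="ij")
r2 = (X - 0.5) ** 2 + (Y - 0.5) ** 2
swirl = np.exp(-r2 / 0.02)
u = -(Y - 0.5) * swirl
v = (X - 0.5) * swirl

# Velocity gradients on the PIV grid.
dx, dy = x[1] - x[0], y[1] - y[0]
du_dx, du_dy = np.gradient(u, dx, dy)
dv_dx, dv_dy = np.gradient(v, dx, dy)

# Two-dimensional Q-criterion: Q > 0 where rotation dominates strain,
# so local maxima of Q mark candidate vortex cores.
Q = -0.5 * (du_dx**2 + dv_dy**2) - du_dy * dv_dx
i, j = np.unravel_index(np.argmax(Q), Q.shape)
print(f"strongest vortex core near x = {x[i]:.2f}, y = {y[j]:.2f}")
```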
The measurement of physical activity in chronic disease populations is becoming increasingly important for researchers and clinical teams. Objective measures help to augment patient self-reports and provide substantial evidence that an intervention is either effective or ineffective. Since 1988, ActiGraph has specialized in developing robust, ambulatory actigraphy monitors that accurately quantify activity levels in various populations. Our devices have been used by some 1,500 institutions in thousands of research projects in 65 countries, making them the world's most validated option for quantifying energy expenditure in humans. Our latest efforts take advantage of low-power wireless protocols and the availability of smartphones to collect and display these data to care teams in near real time. By expediting the data collection process, we aim to reduce the cost of drug trials by revealing the effectiveness of an intervention sooner. This seminar will discuss the technical and economic challenges inherent in developing this solution at ActiGraph.
Biogeochemical cycles on Earth are heavily dependent on enzymatic reactions catalyzed by microscopic life. Subseafloor microbial abundance is estimated to be almost 10^30 cells. A large subset of this biomass consists of microbial life only distantly related to known types and likely includes not-yet-discovered natural products as well as gene products vital to ecosystem function. Technological advances and collaborations are needed for more comprehensive studies of the genetic, genomic, and phenotypic richness of microorganisms in natural habitats. The Gulf of Mexico oil spill in 2010 highlighted a lack of understanding of even basic processes of the northern Gulf of Mexico ecosystem. This seminar will cover ongoing investigations of bacteria, archaea, and microzooplankton, which are key elements of productivity in the Florida Panhandle Bight Shelf and the head of DeSoto Canyon. We are monitoring these microorganisms in a study aimed at addressing spatial and temporal stability and change in the water column and benthos. Data collection includes primary productivity, hydrographic and water nutrient chemistry data, and DNA sequences. The findings add to what is known about Gulf of Mexico microbial loop structure and function and suggest future multidisciplinary possibilities.
Two-dimensional systems offer a rich array of physical phenomena, including the integer and fractional quantum Hall effects, both of which have been observed in multiple materials systems to date. The mitigation of decoherence and the control of coherent quantum states in 2D systems is an area of great current interest, critical for the development of the next generation of solid state electronics based on quantum phenomena. In our experiments, we investigate the terahertz-frequency properties of a high-mobility (μ ≥ 10^6 cm^2 V^-1 s^-1) gallium arsenide two-dimensional electron gas (2DEG) at cyclotron resonance in a perpendicular magnetic field, which results in the formation of a spectrum of Landau levels. We use a picosecond ultrafast terahertz pulse to create a coherent superposition between the highest filled and lowest unfilled Landau levels, forming a two-level system. We can measure the dephasing of the cyclotron ensemble as a function of temperature and time. By using phase-sensitive ultrafast terahertz measurement techniques, we overcome traditional limitations that have prevented accurate spectroscopic studies of dephasing in these high-mobility samples. This has been a critical limitation, since these high-mobility samples are expected to have the long decoherence lifetimes, τ, that would be needed for device applications. Our experiments reveal a strong increase in decoherence at low temperatures and a power-law dependence of the decoherence time from T = 0.4 to 100 K. Finally, we demonstrate preliminary control of this two-level system on a picosecond time scale by using multiple terahertz pulses. With these, we can manipulate the phase of the coherent superposition to demonstrate coherent control, which is very important for any future quantum computation scheme based on 2DEGs and is not easily accessible through alternative techniques.
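For reference, these are the standard textbook relations behind the experiment (not results from the talk): the two-level splitting probed here is the cyclotron energy, and the measured coherent response decays with the dephasing time τ.

```latex
% Landau quantization of a 2DEG in a perpendicular field B:
\[
  E_n = \hbar\,\omega_c\!\left(n + \tfrac{1}{2}\right),
  \qquad
  \omega_c = \frac{eB}{m^{*}} ,
\]
% so the highest filled and lowest unfilled Landau levels form a
% two-level system split by \(\hbar\omega_c\) (for GaAs,
% \(m^{*} \approx 0.067\,m_e\)). The coherently driven cyclotron
% polarization then decays as
\[
  P(t) \propto e^{-t/\tau}\cos(\omega_c t),
\]
% and phase-sensitive terahertz detection extracts \(\tau\) directly.
```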
Engineering systems are becoming increasingly complex. Thus, it is highly desirable that engineers have skills for the analysis, synthesis, and design of such complex systems. A design, in general, transforms specifications into practical systems that satisfy those specifications. The design process requires synthesis; it involves many variables and is challenging, and it develops critical thinking, creativity, and innovation. The ABET accreditation criteria include a capstone design requirement. This presentation describes the ABET Criterion 3 (Student Outcomes) for engineering and computing programs and their linkages to a culminating capstone design experience. The implementation and assessment of the capstone design experience for engineering programs at UWF are also presented.
The USAF (AFRL/RW) and the Army (AMRDEC) partnered with Japan's Technical Research and Development Institute (TRDI) to develop, test, build, and evaluate image gyro technology tailored for airborne applications. The goal is to enable precise and robust 3-D motion estimation by simultaneously using multiple different-direction images, and to enable global positioning in GPS-denied environments by matching pre-captured, geo-registered satellite or aerial imagery. The teams designed and built an exploratory system to demonstrate the feasibility of the approach for airborne applications using real data. Both the US and Japanese teams developed Image Gyro software to determine the ego-motion of a multi-sensor platform by processing data from the passive on-board sensors; GPS data was used for evaluation purposes only. However, the two teams took fundamentally different approaches. The US approach made use of an advanced fusion engine developed with Northrop Grumman under the Optical Flow For Enhanced Navigation and Sensor Exploitation (OFFENSE) program, which adaptively fuses, in real time, optical flow with all available navigation data, including data derived from INU/GPS inputs, altimeters, star tracker devices, passive imaging sensors, and digital elevation databases. In previous studies using simulated data, however, we found that an optical flow approach coupled only with inertial measurements still led to solution drift. Further studies at AFRL/RW showed that a global positional fix was needed at regular intervals to eliminate this drift. Thus AFRL/RW, in collaboration with the Army's AMRDEC team, developed a geo-referencing algorithm that integrates seamlessly into OFFENSE and geo-registers captured imagery against a pre-fetched database using Google Earth. Recently, the US adapted the Japanese design for the sensor suite and built a flight test unit for the purpose of collecting real data for feasibility demonstrations of the Image Gyro system. The sensor suite consists of three wide-field-of-view camera heads, two long-wave uncooled IR cameras, COTS accelerometers, gyros, and pressure and temperature sensors. The US and Japan successfully concluded the first joint flight test earlier this year and are now analyzing the collected data and evaluating the results of their respective Image Gyro software.
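To illustrate why periodic geo-registration fixes are so effective against drift, here is a toy one-dimensional sketch (not the OFFENSE fusion engine; all numbers are invented): position is dead-reckoned from a slightly biased velocity, and an absolute position fix is applied at regular intervals through a scalar Kalman update.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n = 0.1, 600                             # 60 s of flight at 10 Hz
true_pos = np.cumsum(np.full(n, 5.0) * dt)   # platform moving at 5 m/s

# Dead reckoning from optical-flow/inertial velocity: a small bias
# (here +0.05 m/s) makes the integrated position drift without bound.
vel_meas = 5.0 + 0.05 + rng.normal(0.0, 0.2, n)
dead_reckoned = np.cumsum(vel_meas * dt)

# Scalar Kalman filter: predict with the biased velocity, and apply an
# absolute position fix (geo-registered imagery) every 100 steps (10 s).
x, P = 0.0, 1.0
q, r_fix = (0.2 * dt) ** 2, 4.0              # process / fix noise variances
fused = np.empty(n)
for k in range(n):
    x += vel_meas[k] * dt                    # predict
    P += q
    if k % 100 == 99:                        # periodic geo-registration fix
        z = true_pos[k] + rng.normal(0.0, 2.0)
        K = P / (P + r_fix)                  # Kalman gain
        x += K * (z - x)
        P *= 1.0 - K
    fused[k] = x

print(f"final error, dead reckoning: {abs(dead_reckoned[-1] - true_pos[-1]):.1f} m")
print(f"final error, with fixes:     {abs(fused[-1] - true_pos[-1]):.1f} m")
```

Running the sketch shows the dead-reckoned error growing with flight time while the fix-corrected error stays bounded, which mirrors the drift behavior described above.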