The Internet of Things (IoT) is changing the way we interact with our surrounding environment in domains as diverse as health, transportation, office buildings, and our homes. In smart building environments, information captured about a building’s infrastructure and its inhabitants will enable services that make us more productive, increase our comfort, enhance our social interactions, improve safety, save energy, and more. But by relying on the collection and sharing of information about a building’s inhabitants and their activities, these services also open the door to privacy risks.
We are developing a privacy-aware IoT framework that addresses two important privacy problems in smart spaces. The first is a mechanism and language for specifying and managing users’ privacy preferences, flexible enough to represent complex requirements without inducing privacy fatigue. The second is a data management system that can efficiently handle millions of these preferences at run time while keeping the collected user data within reasonable bounds.
Error-correcting codes (ECC) are used ubiquitously in data storage to provide fault tolerance. Traditionally, redundancy and error-tolerance capability are considered the key performance measures for ECC. In distributed storage, however, codes must be designed for the new challenges brought by the networked nature of the system. In this talk, we present regenerating codes, a family of codes that are efficient for single-failure correction. For systems with two redundant storage devices, the repair bandwidth can be reduced by 1/2 compared to traditional ECC. Next, we generalize regenerating codes to the problem of recovering multiple erasures simultaneously. The recovery strategy for multiple failures is similar to that for a single failure, although certain additional requirements must be satisfied.
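The talk’s constructions are more general, but the flavor of the repair-bandwidth saving can be seen in the well-known (4,2) regenerating-code example over GF(2): a file of four symbols is spread over four storage nodes, two of them redundant, and a failed node is rebuilt from three downloaded symbols instead of the four a traditional MDS code would need. The sketch below (symbol values and variable names are purely illustrative) verifies this with XOR arithmetic.

```python
# Minimal sketch of the classic (4,2) regenerating-code example over GF(2);
# XOR (^) is addition in GF(2). Symbol values are arbitrary placeholders.

a1, a2, b1, b2 = 5, 9, 3, 12  # the four file symbols

# Each node stores two symbols; nodes 3 and 4 are the redundant devices.
node1 = (a1, a2)
node2 = (b1, b2)
node3 = (a1 ^ b1, a2 ^ b2)
node4 = (a2 ^ b1, a1 ^ a2 ^ b2)

# Suppose node1 fails. A traditional (4,2) MDS code would download four
# symbols (the whole file) to rebuild it. Here, each surviving node sends
# just ONE symbol -- three symbols total:
s2 = node2[0] ^ node2[1]  # b1 ^ b2
s3 = node3[0] ^ node3[1]  # a1 ^ a2 ^ b1 ^ b2
s4 = node4[0] ^ node4[1]  # a1 ^ b1 ^ b2

a1_rec = s4 ^ s2           # (a1 ^ b1 ^ b2) ^ (b1 ^ b2) = a1
a2_rec = s3 ^ s2 ^ a1_rec  # (a1 ^ a2) ^ a1 = a2

assert (a1_rec, a2_rec) == node1
print("node1 repaired from 3 downloaded symbols instead of 4")
```

Here the failed node is rebuilt from a 3/4 fraction of the file size; larger code constructions push the savings further.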
DifferentialEquations.jl is a package for solving differential equations in Julia. It covers a wide variety of deterministic and stochastic differential equations, mixing discrete and continuous equations. Through extensive use of multiple dispatch, metaprogramming, plot recipes, foreign function interfaces (FFI), and call-overloading, DifferentialEquations.jl offers a unified user interface to solve and analyze various forms of differential equations without sacrificing features or performance. Many modern features are integrated into the solvers, such as arbitrary user-defined number systems for high-precision and unit-checked arithmetic, built-in multithreading and parallelism, and symbolic calculation of Jacobians. A package add-on system provides high-performance tools for parameter estimation, sensitivity analysis, and other higher-order model analyses. Integrated into the package is an algorithm testing and benchmarking suite that both ensures accuracy and serves as an easy way for researchers to develop and distribute their own methods. Together, these features build an extendable suite that is both feature-rich and highly performant.
We provide an efficient algorithm for determining how a road network has evolved over time, given two snapshot instances from different dates. To allow such determinations across different databases, and even against hand-drawn maps, we take a strictly topological approach in this paper, comparing road networks based solely on graph-theoretic properties. Given two road networks of the same region from two different dates, our approach matches road-network portions that remain intact and identifies added or removed portions. We analyze our algorithm both theoretically, showing that it runs in polynomial time for non-degenerate road networks even though a related problem is NP-complete, and experimentally, using dated road networks from the TIGER/Line archive of the U.S. Census Bureau.
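The matching step is the heart of the paper and is not reproduced here, but the output side of the comparison is easy to sketch: once intersections of the two snapshots have been matched (below we simply assume the mapping is known), intact, removed, and added road segments fall out as set differences. All node identifiers and edges in this sketch are made up.

```python
# Toy sketch of snapshot comparison given a known intersection matching.
# Computing such a matching from topology alone is the paper's actual
# contribution and is NOT implemented here.

def diff_roads(edges_old, edges_new, node_map):
    """Classify road segments given a mapping old-node -> new-node."""
    mapped = {frozenset((node_map[u], node_map[v]))
              for u, v in edges_old if u in node_map and v in node_map}
    current = {frozenset(e) for e in edges_new}
    return {"intact": mapped & current,   # segments present in both snapshots
            "removed": mapped - current,  # segments gone in the newer snapshot
            "added": current - mapped}    # segments new in the newer snapshot

# Two snapshots of the same region; ids differ across databases,
# hence the explicit node_map.
edges_1992 = [("a", "b"), ("b", "c"), ("c", "a")]
edges_2000 = [("x", "y"), ("y", "z"), ("z", "w")]
node_map = {"a": "x", "b": "y", "c": "z"}

result = diff_roads(edges_1992, edges_2000, node_map)
print(result)
```

Edges are stored as frozensets so that an undirected road segment compares equal regardless of endpoint order.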
Digital activism tools give voice to grassroots movements. However, a recent proliferation of activist-specific digital messaging tools (DMTs) has diminished the value of citizen communication to policymakers. DMTs provide easy-to-use communication systems for citizens to send messages to policymakers. In theory, when policymakers receive a large number of messages through DMTs, they should conclude that a large number of citizens are interested in that policy. In reality, however, these messages do not indicate citizen interest, because DMTs rely on automated content that is of little to no value to policymakers. In this presentation, I analyze DMTs’ role in political activism in the U.S. and describe how DMTs are paradoxically widening the communication gap between citizens and their policymakers. I discuss this gap created by DMTs in terms of a diffusion of unsuccessful innovation.
Faezeh Tork Ladani
Electrical Engineering and Computer Science
Photo-induced force microscopy (PiFM) is an emerging microscopy technique based on detecting the photo-induced force that a nano-object exerts on a sharp probing tip. PiFM is a near-field detection technique that is sensitive to the chemical and physical properties of the sample. Hence, by analyzing the force signal we can extract chemical information about nano-objects with very high spatial resolution, down to the ~5 nm scale. Near-field detection demands greater complexity, both in the experimental setup and in the interpretation of the experiments. At the same time, however, it provides more accuracy and more information than can easily be obtained in the far field.
We are developing the technique both experimentally and theoretically to open up new applications. We have reported a successful experimental demonstration of this technique in imaging surface plasmon polaritons (SPPs). The results clearly showed that the PiFM technique is capable of mapping propagating SPP modes on flat surfaces. The PiFM method is also promising for the real-space visualization of other propagating surface modes, such as other plasmon modes and the evanescent fields of semiconductor waveguides in photonic circuits.
Electrical Engineering and Computer Science
Wireless communications systems work like human conversations. When two people talk to each other, one person talks while the other listens; in communications engineering, this is called “half-duplex.” The two people cannot talk simultaneously because each one hears his or her own voice much louder than the other person’s voice. If the two people could talk simultaneously, conversations could proceed twice as fast as the way we normally talk. I investigate how to double communications rates by designing “full-duplex” wireless systems, which can transmit and receive simultaneously. Advancements in wireless technology continuously produce new wireless applications, and increasing communications speed through my current research in full-duplex systems is essential to satisfy the demand for more wireless connections and higher communications traffic.
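The “hearing your own voice” problem can be sketched numerically. In the toy simulation below (not a real radio chain; the channel gain and signal levels are illustrative), the receiver’s own transmission is far louder than the remote signal, but because the receiver knows exactly what it transmitted, it can estimate the self-interference channel by least squares and subtract its own “voice.”

```python
import numpy as np

# Toy sketch of digital self-interference cancellation, the key enabler
# of full-duplex radio. All signals are simulated real-valued samples.

rng = np.random.default_rng(0)
n = 1000
x_self = rng.standard_normal(n)           # our own transmitted signal (known)
x_remote = 0.01 * rng.standard_normal(n)  # far node's signal, ~40 dB weaker
noise = 1e-4 * rng.standard_normal(n)     # receiver noise

h_true = 0.8                              # self-interference channel gain
y = h_true * x_self + x_remote + noise    # what the receiver actually hears

# Estimate the self-interference channel from the known transmit signal.
h_hat = np.dot(x_self, y) / np.dot(x_self, x_self)

residual = y - h_hat * x_self             # cancel our own "voice"
print("residual power / received power:",
      np.mean(residual**2) / np.mean(y**2))
```

After cancellation, the residual is dominated by the remote signal, which can then be decoded as in an ordinary half-duplex receiver. Practical full-duplex radios also need analog cancellation, since strong self-interference can saturate the receiver front end before any digital processing.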
Imam Uz Zaman
Electrical Engineering and Computer Science
We are developing an omnidirectional Inter-Satellite Optical Communicator (ISOC) that will enable data rates up to 1 Gbps over distances up to 200 km in free space. In particular, this effort tackles the space, power, and performance challenges pertinent to CubeSat missions. The ISOC features a dodecahedron geometry that can fit inside a CubeSat and utilizes avalanche photodetector arrays and gimbal-less MEMS scanning mirrors for beam steering. An omnidirectional antenna with spherical coverage will, in principle, be capable of solving the pointing problem typical of optical communications. The architecture of the proposed omnidirectional inter-satellite communicator consists of a compact dodecahedron array of eleven optical transceivers, each furnished with an InGaAs avalanche photodiode, an optical circulator, and a high-power laser diode. We are developing two omnidirectional communicator prototypes capable of full-duplex operation. Due to its design, the ISOC should allow multiple simultaneous links among several spacecraft. We plan to use advanced, efficient, lightweight, single-mode laser diodes operating at 1550 nm.
Physics and Astronomy
Still in its pilot stage, the ARIANNA array is honing the technology needed to detect some of the highest-energy particles in the universe. By using Antarctica’s Ross Ice Shelf as a detector, we aim to detect cosmic neutrinos with energies in excess of 10^18 eV, higher than any neutrinos detected so far and many orders of magnitude above what we can achieve with man-made accelerators. By using low-cost radio detectors we can survey hundreds of cubic kilometers of ice, which is what it’ll take to see these extremely rare and weakly interacting particles. We’ll discuss how we hope to make this detection, the unique things these neutrinos can tell us about our universe, and a bit about what it’s like to do physics in one of the most inhospitable environments on the planet.
Physics and Astronomy
Mother Nature has gifted us with massive telescopes that enable us to look farther back into the history of our universe. These natural telescopes are known as gravitational lenses, and finding them among millions of galaxies is not so easy. But what they can reveal about the evolution of our universe encourages us to search for them. To do so, we have developed and employed an artificial intelligence algorithm to search for these lenses. In this talk, I will explain how gravitational lenses work and how we use artificial neural networks to find them.
Physics and Astronomy
Nuclear fusion energy is perpetually 50 years away. The pursuit of that dream has until recently required massive facilities costing billions of dollars, due to the immense technical challenges of sustainably fusing two small atoms together. In the past decade we have begun using a method called thermal density functional theory (DFT) to calculate the properties of incredibly hot substances such as the cores of gas giants, explosions, and fusion reactions. Thermal DFT is imperfect, however, and in my research I work to make the theory more accurate. Improved accuracy leads to a better understanding of the physics, which in turn can lead to improved technology at a fraction of the cost.
Thomas E. Baker
Physics and Astronomy
Common materials, such as metals, can resemble the properties of a fundamental particle physics system. For example, evidence for Higgs bosons was first found in the 1980s in the theory of superconductivity. Superconductors are metals that have no electrical resistance at low temperatures. Making a superconductor is easy in comparison with building the Large Hadron Collider, where the Higgs boson was recently found.
My work shows that neutrinos, among the lightest and hardest-to-measure particles in our universe, can be understood from the theory of superconductors placed near magnets. Neutrinos are known to oscillate between three types, called flavors. These three flavors correspond to superconducting states, but there is also a fourth state in the superconductor controlling flavor oscillations, which provides evidence for a sterile neutrino. I review the quantum mechanical analogy between the two systems and nanoscale superconducting transport theory, from which the behavior of neutrino oscillations can be understood.
Proteins are tiny molecular machines involved in nearly all cellular functions, with roles as diverse as structural support and cellular signaling. Malfunction of individual proteins results in diseases such as cancer, high blood pressure, and Alzheimer’s disease. Consequently, most therapeutic drugs are small molecules that bind to proteins to modify their actions. Drug discovery is a long and costly process, taking on average at least ten years and up to $2.6 billion for each successful drug. Computational chemistry methods can provide a “shortcut” in the drug discovery pipeline by offering insight into the key molecular interactions that could make a good drug for a specific protein. My work focuses on the human protein Hv1, a voltage-gated proton channel whose function is to remove excess protons from inside the cell. This protein is implicated in a number of maladies, including the metastasis of breast and colon tumor cells as well as a substantial contribution to brain damage in ischemic stroke. The objective of my work is to understand, improve, and predict small-molecule binding to Hv1, which can lead to clinically useful therapeutics.
The diffraction limit is the well-known physical limit on the size of a focused beam of light, preventing optical microscopy from achieving spatial resolutions better than ~300 nm in the visible spectrum. In order to use optical spectroscopy to study chemistry at the scale of an individual molecule, this barrier has to be broken. Making use of plasmonics, specially designed nano-structures accomplish just that, confining light to smaller and smaller volumes. Through the use of a carefully fabricated nano-antenna, we can now focus light down to the atomic scale. Tip-enhanced Raman spectroscopy (TERS) can then be used to image a single molecule based on its optical spectrum. This sub-nanometer laser spot allows us to observe and initiate chemistry with unprecedented spatial resolution.
This presentation will focus on two separate topics: an econometric analysis of transportation noise near Los Angeles International Airport (LAX), and the benefits of using spatial models to produce better estimates in geographically referenced analyses. Surprisingly, while considerable research has been conducted on transportation-related airborne pollution and its effects, there has been minimal study in the U.S. of the economic and health impacts of transportation noise.
This spatial hedonic analysis focuses on the neighborhoods adjacent to LAX, using single-family housing prices as a proxy to estimate the dollar cost of noise generated by airport activity. Spatial-model parameter estimates are then compared to commonly used ordinary least squares (OLS) regression estimates, demonstrating the tendency of OLS to produce inflated parameter estimates when spatial dependence is ignored.
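Why plain OLS misleads here can be shown with a toy simulation (not the LAX data or the talk’s model; every parameter below is illustrative). When the outcome spills over between neighbors (a spatial lag) and the covariate is itself spatially autocorrelated, as noise exposure around an airport is, OLS that ignores the spatial structure overstates the covariate’s coefficient.

```python
import numpy as np

# Toy demonstration of OLS coefficient inflation under spatial dependence.
rng = np.random.default_rng(42)
n = 300

# Ring-lattice spatial weights: each observation's neighbors are the two
# adjacent observations, row-standardized (a crude stand-in for adjacency).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = 0.5
    W[i, (i + 1) % n] = 0.5

rho, lam, beta = 0.7, 0.7, 1.0   # spatial-lag strength, covariate
I = np.eye(n)                    # autocorrelation, true coefficient

# Spatially autocorrelated covariate, and a spatial-lag outcome:
#   y = rho * W @ y + x * beta + e   =>   y = (I - rho*W)^(-1) (x*beta + e)
x = np.linalg.solve(I - lam * W, rng.standard_normal(n))
e = rng.standard_normal(n)
y = np.linalg.solve(I - rho * W, x * beta + e)

beta_ols = np.dot(x, y) / np.dot(x, x)  # OLS slope with no spatial term
print(f"true beta = {beta}, OLS estimate = {beta_ols:.2f}")
```

The OLS slope absorbs the omitted neighbor spillovers and comes out well above the true coefficient; a spatial-lag model that estimates rho jointly with beta avoids this.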
This talk presents a subsection of my research on ridesharing technology and its impact on driving as work in India. I argue that cities and people, not technology, are the real heroes of the ridesharing-app story, by showing how the experience of ridesharing (from hailing to completing a ride) needs to be studied as an urban phenomenon, taking into account factors such as traffic, class, gender, caste, and labor conditions. By calling for a contextual study of ridesharing apps, I show how infrastructure and labor are not stable and passive but in fact need constant attention to make the technology successful.
Planning, Policy, and Design
High-quality parks provide numerous health benefits and have few barriers to access. However, underrepresented neighborhoods have fewer high-quality parks than white and affluent neighborhoods, and this unequal distribution may occur even in cities reputed for high-quality parks. It is this unequal distribution that renders parks an issue of Environmental Justice (EJ). I hypothesize that for parks in underrepresented neighborhoods to be of high quality, decision-makers must offer underrepresented groups, along with public and private partners, opportunities to collaborate in park development and operations, so as to provide health benefits and address issues of access. I present the first portion of my dissertation project: an examination of park quality and EJ, in terms of park distribution, for a large sample of neighborhoods. I use ArcGIS to conduct a spatial analysis, assemble 2010 U.S. Census data on demographic variables to represent each census tract’s potential for environmental injustice, and use OLS regression to estimate the relationship between park quality and demographic variables.
Matin Rahnamay Naeini
Civil and Environmental Engineering
The increasing number of optimization methods makes testing and validating algorithms across different problem spaces impractical. Different optimization methods have their own strengths and limitations, and the performance of any individual algorithm is subject to the “No Free Lunch” theorem. This variability makes it challenging for users to select a suitable optimization algorithm for a particular problem. In this research we introduce a new generic, hybrid optimization framework, the Shuffled Complex-Self Adaptive Hybrid EvoLution (SC-SAHEL) algorithm, which employs multiple evolutionary algorithms as search cores on a parallel platform. The advantages of SC-SAHEL are that it intelligently selects the best-performing algorithms during the search and that it finds the global optimum more effectively than any single search mechanism. In addition, SC-SAHEL reports the performance of each evolutionary algorithm at each optimization step, giving a general overview of how the evolutionary algorithms perform.
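The published SC-SAHEL algorithm is not reproduced here, but its general shape can be sketched: a shared population is split into complexes, each complex is evolved by a different search strategy, the population is shuffled between rounds, and per-strategy progress is tracked so the framework can judge which strategy is performing best. The strategies, objective, and parameters below are all illustrative stand-ins.

```python
import random

def sphere(x):                      # toy objective: global minimum 0 at origin
    return sum(v * v for v in x)

def mutate(cx, f):                  # strategy 1: gaussian steps around the best
    best = min(cx, key=f)
    return [[v + random.gauss(0, 0.3) for v in best] for _ in cx]

def crossover(cx, f):               # strategy 2: blend random parent pairs
    out = []
    for _ in cx:
        a, b = random.sample(cx, 2)
        w = random.random()
        out.append([w * p + (1 - w) * q for p, q in zip(a, b)])
    return out

def multi_core_search(f, dim=5, pop_size=40, rounds=30):
    random.seed(1)
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    strategies = {"mutate": mutate, "crossover": crossover}
    progress = {name: 0.0 for name in strategies}   # improvement per strategy
    for _ in range(rounds):
        random.shuffle(pop)                         # shuffle the complexes
        half = pop_size // 2
        complexes = {"mutate": pop[:half], "crossover": pop[half:]}
        pop = []
        for name, cx in complexes.items():
            before = min(map(f, cx))
            evolved = sorted(cx + strategies[name](cx, f), key=f)
            cx = evolved[:half]                     # elitist survivor selection
            progress[name] += before - min(map(f, cx))
            pop += cx
    return min(pop, key=f), progress

best, progress = multi_core_search(sphere)
print("best value found:", sphere(best))
print("improvement credited to each strategy:", progress)
```

Because survivors are chosen elitistically within each complex, the per-strategy improvement totals are non-negative, and comparing them is the simplest version of the “which search core is working” bookkeeping the framework relies on.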
Electricity comes from many different sources. There are renewable energies, such as solar and wind; there are fossil-fuel-based energies, such as gasoline and natural gas; and there are energy storage systems, such as batteries and, potentially, hydrogen. As California attempts to achieve higher renewable penetration, it is important to understand the effects of new technologies on the electric grid. For example, when the sun is setting, people head home and start using a lot of electricity. Since the sun is down, solar power is no longer available, and because wind is difficult to predict, it is uncertain whether wind power can meet the demand. Peaker power plants must therefore be utilized, increasing greenhouse gas (GHG) and pollutant emissions in order to produce electricity that meets the demand. This problem could be mitigated with energy storage technologies such as batteries or hydrogen. Only by coupling renewable power with energy storage will California be able to achieve high renewable penetration and low GHG and pollutant emissions.