Active Subspace and Surrogate Model Techniques for Complex Physical and Biological Models

Friday, October 21, 2016 - 02:20 pm
Swearingen 2A31
Abstract: For many complex physical and biological models, the computational cost of high-fidelity simulation codes precludes their direct use for Bayesian model calibration and uncertainty propagation. Furthermore, the models often have tens to thousands of inputs, comprising parameters, initial conditions, or boundary conditions, many of which are unidentifiable in the sense that they cannot be uniquely determined from measured responses. In this presentation, we will discuss techniques to isolate influential inputs and employ surrogate models when computational budgets are limited. For input selection, we will discuss the use of global sensitivity analysis methods to isolate influential inputs and active subspace construction for linearly related parameters. We will also discuss the manner in which Bayesian calibration on active subspaces can be used to quantify uncertainties in physical parameters. These techniques will be illustrated for models arising in nuclear power plant design and HIV characterization and treatment.

Biosketch: Ralph Smith received his PhD in Applied Mathematics from Montana State University in 1990. Following a three-year postdoctoral position at the Institute for Computer Applications in Science and Engineering (ICASE) at NASA Langley Research Center, he was an Assistant Professor in the Department of Mathematics at Iowa State University. He joined the North Carolina State University faculty in 1998, where he is presently a Distinguished Professor of Mathematics. He is Editor-in-Chief of the SIAM book series Advances in Design and Control and serves on the editorial boards of the SIAM/ASA Journal on Uncertainty Quantification and the Journal of Intelligent Material Systems and Structures. He is co-author of the research monograph Smart Material Structures: Modeling, Estimation and Control and author of the books Smart Material Systems: Model Development and Uncertainty Quantification: Theory, Implementation, and Applications. His research areas include mathematical modeling of smart material systems, numerical analysis and methods for physical systems, Bayesian model calibration, sensitivity analysis, control, and uncertainty quantification.
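For context on the active subspace construction mentioned in the abstract, the sketch below shows the standard gradient-based approach: eigendecompose the average outer product of sampled gradients and keep the dominant eigenvectors. The toy model, dimensions, and variable names are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def active_subspace(grad_samples, k):
    """Estimate an active subspace from sampled gradients.

    grad_samples: (M, m) array; each row is the gradient of the scalar model
                  output with respect to the m inputs at one sample point.
    k: number of active directions to retain.
    Returns (eigenvalues, W1), where the columns of W1 span the estimated subspace.
    """
    C = grad_samples.T @ grad_samples / grad_samples.shape[0]  # m x m gradient covariance
    eigvals, eigvecs = np.linalg.eigh(C)            # eigh returns ascending eigenvalues
    order = np.argsort(eigvals)[::-1]               # reorder to descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    return eigvals, eigvecs[:, :k]

# Illustrative use: a toy model f(x) = (a^T x)^2 varies only along the direction a.
rng = np.random.default_rng(0)
m, M = 10, 500
a = rng.normal(size=m)
X = rng.normal(size=(M, m))
grads = 2.0 * (X @ a)[:, None] * a[None, :]         # gradient of (a^T x)^2 at each sample
vals, W1 = active_subspace(grads, k=1)
print(vals[:3])   # a sharp drop after the first eigenvalue signals a 1-D active subspace
```

A pronounced gap in the eigenvalue spectrum indicates that the model varies mostly along a low-dimensional subspace, which is what makes calibration on the active subspace tractable when the full input space is not.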

Google Talk

Wednesday, October 5, 2016 - 05:00 pm
Amoco Hall on the 1st floor of Swearingen.
Google representatives will be hosting a presentation on opportunities at Google for interested students on Wednesday, October 5th at 5:00PM in Amoco Hall on the 1st floor of Swearingen.

Phishing

Friday, September 23, 2016 - 05:30 pm
Amoco Hall

Why Does Every Tech Company Care about Patents? (Or, Why Does My Manager Keep Bugging Me about Patents?)

Tuesday, September 13, 2016 - 05:45 pm
SWGN 2A31
Virtually every technology company has a patent portfolio. Whether they’re in the news for suing another company for patent infringement, being sued themselves (often by “patent trolls”), or selling their patent portfolio for billions of dollars, patents and technology are inextricably linked. Why is this the case, and what is a patent, anyway? If patents are so important to these companies, what should I know about patents prior to getting a job in computers? Does my CS or EE degree qualify me to get a job in the patent field? Come find out the answers to these questions, and ask your own. Stephen Shaw, who received a CS degree (‘03) and a law degree (‘97) from USC, will present a colloquium on Tuesday, September 13, from 5:45 to 6:45 pm in SWGN 2A31. Stephen currently lives in San Francisco and works for a large firm there. The talk is sponsored by the Software Engineering SIG of our student ACM chapter.

Computerized training of single-word production in speakers with aphasia or apraxia of speech

Tuesday, September 13, 2016 - 12:00 pm
Discovery 1, 915 Greene Street, Room 140
TecHealth, a South Carolina SmartState Center at the USC Arnold School of Public Health, invites you to Tech Tuesday Talks.
Date: Tuesday, September 13
Time: Noon
Place: Discovery 1, 915 Greene Street, Room 140
Speaker: Dirk den Ouden, Ph.D., Associate Professor

Dirk B. den Ouden is the Director of the Neurolinguistics Laboratory in the Communication Sciences and Disorders Department in the Arnold School of Public Health. In collaboration with Drs. Jijun Tang and Jeremiah Shepherd from the Dept. of Computer Engineering, the lab has developed a game-based computer application that allows speakers with aphasia to train speech output autonomously, making use of rhythmic cueing, instant feedback, and the benefits of overlearning for neural plasticity.

Tech Tuesday Talks is a free, monthly seminar series bringing together researchers from across the USC campus who share an interest in technology-assisted health promotion and disease prevention interventions and research. The series presents a forum to learn about one another’s work, spark collaborations, and introduce students to ongoing research on the USC campus that incorporates technology in health promotion. All interested faculty, staff, students, and the general public are invited to attend. Tech Tuesday Talks are presented on the second Tuesday of the month in the Discovery I Building, Room 140, at noon. To learn more, visit http://techealth.sc.edu/tech-tuesday-talks or contact magradey@mailbox.sc.edu.

Proper Orthogonal Decomposition Reduced-Order Modeling of Complex Fluid Flows

Friday, September 9, 2016 - 02:20 pm
Speaker: Zhu Wang
Affiliation: Mathematics, USC
Location: SWGN 2A14
Time: 2:20 - 3:10 PM

Abstract: In many scientific and engineering applications of complex fluid flows, computational efficiency is of paramount importance. However, because applications such as control, optimization, data assimilation, and uncertainty quantification require repeated numerical simulations, using the original system becomes prohibitive. Therefore, model reduction techniques are frequently used by engineers and researchers. Among them, proper orthogonal decomposition is one of the most commonly used methods for generating reduced-order models of turbulent flows dominated by coherent structures. To balance the low computational cost required of a reduced-order model against the complexity of the target turbulent flows, appropriate closure modeling strategies need to be employed. In this talk, we present reduced-order modeling strategies that synthesize ideas from proper orthogonal decomposition and large eddy simulation, develop rigorous error estimates, and design efficient algorithms for the new reduced-order models.

Bio: Dr. Wang is an assistant professor in the Department of Mathematics at the University of South Carolina. He earned his PhD in Mathematics from Virginia Tech in 2012 and was an industrial postdoc at the IMA at the University of Minnesota, Twin Cities, from 2012 to 2014. Dr. Wang's research interests include scientific computing, numerical analysis, reduced-order modeling, climate modeling, large eddy simulation, and numerical solutions of PDEs.
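As background for the POD approach described above, here is a minimal sketch of the usual way a POD basis is extracted from snapshot data via the singular value decomposition; the synthetic data and variable names are illustrative only.

```python
import numpy as np

def pod_basis(snapshots, r):
    """Compute a rank-r POD basis from a snapshot matrix.

    snapshots: (n, M) array whose columns are flow-field snapshots
               (n spatial degrees of freedom, M time instances).
    r: number of POD modes to keep.
    Returns (modes, singular_values).
    """
    mean = snapshots.mean(axis=1, keepdims=True)           # subtract the mean flow
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    return U[:, :r], s

# Illustrative use on synthetic, effectively rank-5 "flow" data.
rng = np.random.default_rng(1)
Q = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 50))
Phi, s = pod_basis(Q, r=5)
print((s[:5]**2).sum() / (s**2).sum())   # fraction of fluctuation energy captured by 5 modes
```

The reduced-order model is then obtained by projecting the governing equations onto the retained modes; the closure strategies discussed in the talk compensate for the effect of the truncated modes.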

Application of Game Theory to High Assurance Cloud Computing

Friday, September 2, 2016 - 02:20 pm
Swearingen 2A14
COLLOQUIUM
Department of Computer Science and Engineering, University of South Carolina
Speaker: Charles A. Kamhoua
Date: September 02, 2016
Time: 1420-1510 (2:20-3:10 pm)
Place: Swearingen 2A14

Abstract: The growth of cloud computing has spurred many entities, both small and large, to use cloud services for cost savings. Public cloud computing has allowed for quick, dynamic scalability without much overhead or long-term commitment. However, concern over cyber security is the main reason many large organizations with sensitive information, such as the Department of Defense, have been reluctant to join a public cloud. This is due to three challenging problems. First, current cloud infrastructures lack provable trustworthiness. Integrating Trusted Computing (TC) technologies with cloud infrastructure is a promising method for verifying the cloud’s behaviors, which may in turn facilitate provable trustworthiness. Second, public clouds carry an inherent and poorly understood danger stemming from a shared platform, namely the hypervisor. An attacker who subverts a virtual machine (VM) and then goes on to compromise the hypervisor can readily compromise all virtual machines on that hypervisor. We propose a security-aware virtual machine placement scheme in the cloud. Third, a sophisticated attack in a cloud has to be understood as a sequence of events, which calls for a detection/response model that encompasses observations from varying dimensions. We discuss a method to automatically determine the best response, given observations of the system states from a set of monitors. Game theory provides a rich mathematical tool to analyze conflict within strategic interactions and thereby gain a deeper understanding of cloud security issues. Theoretical constructs and mathematical abstractions provide a rigorous scientific basis for cyber security because they allow for quantitative reasoning about cyber-attacks. This talk will address the three challenging cloud security problems identified above and report on our latest findings from this body of work.

Bio: Charles A. Kamhoua received the BS in electronics from the University of Douala (ENSET), Cameroon, in 1999, and the MS in telecommunication and networking and the PhD in electrical engineering from Florida International University (FIU), in 2008 and 2011, respectively. In 2011, he joined the Cyber Assurance Branch of the U.S. Air Force Research Laboratory (AFRL), Rome, New York, as a National Academies Postdoctoral Fellow and became a Research Electronics Engineer in 2012. Prior to joining AFRL, he was an educator for more than 10 years. His current research interests include the application of game theory to cyber security, survivability, cloud computing, hardware Trojans, online social networks, wireless communication, and cyber threat information sharing. He has more than 60 technical publications in prestigious journals and international conferences, along with a Best Paper Award at the 2013 IEEE FOSINTSI. He has mentored more than 40 young scholars at AFRL, including Summer Faculty Fellows, postdocs, and students. He has been invited to give more than 30 keynote and distinguished speeches in the USA and abroad. He has been recognized for his scholarship and leadership with numerous prestigious awards, including 30 Air Force Notable Achievement Awards, the 2016 FIU Charles E. Perry Young Alumni Visionary Award, the 2015 AFOSR Windows on the World Visiting Research Fellowship at Oxford University, UK, an AFOSR Basic Research Award, the 2015 Black Engineer of the Year Award (BEYA), the 2015 NSBE Golden Torch Award (Pioneer of the Year), selection to the 2015 Heidelberg Laureate Forum, and the 2011 NSF PIRE Award at the Fluminense Federal University, Brazil. He is currently an advisor for the National Research Council; a member of ACM, the FIU alumni association, and NSBE; and a senior member of IEEE.
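To give a rough, self-contained flavor of the strategic reasoning that game theory brings to problems like security-aware VM placement, the sketch below enumerates pure-strategy Nash equilibria of a small attacker-versus-defender bimatrix game. The game, payoffs, and names are an illustrative assumption, not the speaker's model.

```python
import itertools
import numpy as np

# Toy game: the defender chooses which of two hypervisors to harden/monitor,
# the attacker chooses which to target. Rows = defender action, columns = attacker action.
defender_payoff = np.array([[ 2, -3],    # defend H1: attacker hits H1 / H2
                            [-3,  2]])   # defend H2: attacker hits H1 / H2
attacker_payoff = -defender_payoff       # zero-sum for simplicity

def pure_nash(A, B):
    """Enumerate pure-strategy Nash equilibria of a bimatrix game (A: row player, B: column player)."""
    eqs = []
    for i, j in itertools.product(range(A.shape[0]), range(A.shape[1])):
        row_best = A[i, j] >= A[:, j].max()   # row player cannot gain by deviating
        col_best = B[i, j] >= B[i, :].max()   # column player cannot gain by deviating
        if row_best and col_best:
            eqs.append((i, j))
    return eqs

print(pure_nash(defender_payoff, attacker_payoff))  # [] : this matching-pennies-like game has no pure equilibrium
```

Because no pure-strategy equilibrium exists in this toy game, equilibrium play requires randomization, which is one reason randomized or unpredictability-based defense strategies arise naturally in such settings.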

A Proposed Numerical Data Standard Supporting Automated Network Cluster Analytics

Friday, August 19, 2016 - 02:20 pm
SWGN 2A14
I would like to invite you to Dr. Joseph Johnson's talk as part of CSCE 791 - Seminar on Advances in Computing. The seminar is open to anyone who is interested, not just students enrolled in the CSCE 791 class.
Speaker: Joseph E. Johnson, PhD
Affiliation: Physics, USC
Location: SWGN 2A14
Time: 2:20 - 3:10 PM

Abstract: A standard is proposed for all numeric data that tightly integrates (1) each numerical value with (2) its units, (3) its accuracy (uncertainty) level, and (4) defining metadata into a new object called a MetaNumber (MN), with full mathematical processing of dimensional and error analysis along with full management of the associated defining metadata tags. This lays a foundation for fully automated processing by intelligent agents. The MN standard has been designed, programmed, and is now operational on a server in a Python environment as a multiuser cloud application accessible from any internet-linked device. Both transactional computations and API calls are supported. All numeric data is easily readable by both humans and computers, and every data value has a unique name which can serve as its variable name in computation. Two examples are then explored of how such a data standard can support new AI directions and Big Data applications: (1) automated cluster analysis of the associated derived networks using our theorems based upon Markov type Lie algebras and groups, and (2) with additional cluster analysis, the tracking of computational processes, identifying the underlying mathematical structures, core constants, component data, and user models. The MN design creates a network of all linked clusters of numerical information and computational processes, providing a new vision of our “numerical universe”. The system has extensive applications to business, scientific, and industrial processing with fully automated data exchange.

Bio: Dr. Johnson is a Distinguished Professor Emeritus in the Department of Physics at the University of South Carolina. His primary research interest is theoretical physics and information theory, with specialization in the foundations of relativistic quantum theory utilizing Lie algebras; his initial work developed a new formulation of relativistic position operators, thus generalizing the Poincare group. Later he found a new method of decomposing the Lie group and algebra of the most general linear transformation group in n dimensions into a scaling algebra and an n(n-1)-dimensional Markov type Lie algebra. The latter algebra, when restricted using a particular Lie basis, generates all possible continuous Markov transformations and is instrumental in the study of entropy, information theory, and diffusion. One of his most important discoveries was that the Markov algebra is exactly isomorphic to all possible networks. This allows the power of Lie groups and algebras to be linked to the theory of Markov transformations, and likewise to the full theory of networks and their classifications. He has developed an expansion of an arbitrary network as a series of Renyi entropy metrics with decreasing term importance and full network information, similar to a Fourier expansion. His USC R&D team (the Advanced Solutions Group, www.asg.sc.edu) developed advanced software systems for which he was the sole PI on over 120 grants totaling $14M to USC between 1992 and 2012. DARPA funding of $2.4M in 2004-2007 supported investigations in Markov entropy metrics and clustering for analyzing networks. Currently his work concentrates on (1) the proposed numerical metadata system www.metanumber.com, (2) the QRECT classroom system, which uses advanced expert algorithms for self-correcting systems to determine optimal responses, (3) the mathematical foundations of networks and cluster analysis, and (4) a proposed methodology for the integration of general relativity with quantum theory (May 5, 2016 Colloquium in Physics). He recently presented a paper titled “Clustering and Network Analysis as a Data Analytic Tool” at the American Physical Society national annual meeting in Salt Lake City, Utah. He is currently the PI for three active grants: Aspire 1, Aspire II, and SC Floods.
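As a rough illustration of the idea behind the MetaNumber object (bundling a value with its units, uncertainty, and metadata so that arithmetic carries all of them along), here is a minimal sketch using standard error propagation for multiplication. It is not the actual metanumber.com implementation; the class, fields, and example data are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class MetaNumber:
    """Illustrative sketch only: a numeric value bundled with units, uncertainty, and metadata."""
    value: float
    units: dict                     # e.g. {"m": 1, "s": -1} represents m/s
    uncertainty: float = 0.0        # absolute (1-sigma) uncertainty
    meta: dict = field(default_factory=dict)

    def __mul__(self, other):
        # Units: exponents add; uncertainty: relative errors add in quadrature.
        units = dict(self.units)
        for u, p in other.units.items():
            units[u] = units.get(u, 0) + p
        value = self.value * other.value
        rel = ((self.uncertainty / self.value) ** 2 +
               (other.uncertainty / other.value) ** 2) ** 0.5
        return MetaNumber(value,
                          {u: p for u, p in units.items() if p != 0},
                          abs(value) * rel,
                          {**self.meta, **other.meta})

# Example: a distance with 1% uncertainty times an inverse time gives a speed with combined uncertainty.
d = MetaNumber(100.0, {"m": 1}, 1.0, {"source": "odometer"})
inv_t = MetaNumber(0.1, {"s": -1}, 0.001, {"source": "stopwatch"})
print(d * inv_t)   # value=10.0, units={'m': 1, 's': -1}, uncertainty ≈ 0.141
```

The point of the sketch is the design idea the abstract describes: once value, units, uncertainty, and metadata travel together through every operation, downstream tools and intelligent agents can process the data automatically without losing its defining context.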