Leading by example in promoting workforce diversity, Purdue is actively working to advance and support women in high-performance computing through a networking group for female faculty, staff and students interested in diversity in HPC. In 2018, Purdue WHPC affiliated with the international Women in HPC (WHPC) organization as one of its first chapters, working together toward their shared mission of promoting diversity and inclusion in the technical workforce. At SC18, Purdue fielded an all-women Student Cluster Competition (SCC) team made up of undergraduate WHPC members.
Purdue’s Scholar cluster is open to Purdue instructors from any field who want to use HPC in the classroom. Recently Scholar was updated to include interactive capabilities and increased storage (with funding from statistics professor Mark Daniel Ward) and GPU nodes. Scholar is currently being used in more than 80 classes. Learn more: https://bit.ly/357qGoF, https://bit.ly/378lKRZ.
Research Computing offers internships to many students across a variety of disciplines. Undergraduate and graduate students have worked with the Envision Center to build virtual reality simulations to assist with research and education, with the Scientific Solutions Group and HUBzero to develop cyberinfrastructure to support scientific discovery and with the research services and support team to support high-performance computing users. Learn more: https://bit.ly/2QjjQYJ, https://bit.ly/2XfL3gy.
Purdue’s Emerging IT Professionals program is a three-year, rotational program for recent graduates who are interested in learning more about different IT fields. Participants complete two 18-month rotations in areas such as Research Computing; Security and Policy; and Infrastructure Services, and also receive professional development and funding for a graduate degree.
Thanks to a partnership between Purdue and Universidad EAFIT in Medellín, Colombia, hardware from retired Purdue clusters has found new life as EAFIT’s first research supercomputer, Apolo, which has been used to study tropical disease, seismic engineering and quantum mechanics, among other things. The schools have also sent joint teams to student cluster competitions, and this year two students from EAFIT are working as interns for Purdue Research Computing.
Purdue Research Computing has a mission not only to support faculty research, but also to introduce students to HPC. In the last several years, Purdue has fielded its first-ever all-women team, sponsored joint teams with other universities and volunteered with the first team made up entirely of high school students. Read more about this year's team: https://bit.ly/2Qjn7rh
Purdue’s Halstead supercomputer serves as a virtual wind tunnel for professor Jonathan Poggie, who studies high-speed fluid mechanics and flow control in the hopes of making supersonic and hypersonic flight more efficient. Halstead’s parallel processing capabilities allow Poggie to model a fluid flow by breaking it up into millions of small grid cells that capture turbulence on a smaller scale, eliminating the need for him to test his theories in a multimillion-dollar physical wind tunnel. Read more about this work: https://bit.ly/2qW3DOt.
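The idea of breaking a flow field into blocks of grid cells that can be updated in parallel can be illustrated with a toy sketch (this is not Poggie's actual solver; the grid sizes, the row-block partitioning and the simple smoothing update are all illustrative assumptions):

```python
# Toy sketch of domain decomposition: split a 2-D grid into row blocks,
# update each block independently, then reassemble the full field.
import numpy as np

def split_domain(n_rows, workers):
    """Divide n_rows grid rows into roughly equal blocks, one per worker."""
    chunks = np.array_split(np.arange(n_rows), workers)
    return [(c[0], c[-1] + 1) for c in chunks]

def smooth_block(field, r0, r1):
    """Toy update for rows r0..r1: average each interior row with its
    neighbors (a stand-in for a real finite-volume flux computation)."""
    out = field[r0:r1].copy()
    for i in range(max(r0, 1), min(r1, field.shape[0] - 1)):
        out[i - r0] = 0.25 * (field[i - 1] + field[i + 1] + 2 * field[i])
    return out

field = np.random.rand(1000, 1000)
blocks = split_domain(field.shape[0], workers=4)
# On a cluster each block would run on its own node; here we loop serially.
updated = np.vstack([smooth_block(field, r0, r1) for r0, r1 in blocks])
```

Because each block only reads its neighbors' boundary rows, the blocks can be distributed across many nodes, which is what lets a cluster resolve millions of cells at once.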
Purdue’s Envision Center collaborated with an aeronautics and astronautics class taught by Purdue professor Sarag Saikia to develop a virtual reality simulation of the Martian base designed by the class. When Buzz Aldrin, the second man on the moon, visited Purdue for the Human Journey to Mars forum, he tried out the simulation and proclaimed, “Why do you need to go to Mars when you’ve got all this?” Explore Mars yourself in the Purdue booth! Read more: https://bit.ly/350sCyV.
Purdue’s SpaceX Hyperloop Pod Competition team used Purdue’s supercomputers to design an aerodynamic shell with low drag, and to build a magnetic levitation system that fit on their pod and maximized lift while minimizing drag. The team, which was one of only seven that made it through certain checkpoints and was able to test its pod on the track, credits the ability to quickly model many different configurations with its impressive lift-to-drag ratio of 13. Learn more: https://bit.ly/2KlGX1b.
Carlo Scalo, assistant professor of mechanical engineering, uses the Brown supercomputer for his work in aerodynamics and vortex dynamics. Scalo studies the flow of everything from a comparatively low-speed commercial airliner to a high-speed missile. The higher the speed, the more computational power is required to capture details of the flow. Scalo, who has access to even more powerful supercomputers at the Department of Defense Supercomputing Resource Center, finds himself needing to use them less often as Purdue continues to invest in on-campus computing resources. Learn more: https://bit.ly/2O5tUlm.
Purdue professor Daisuke Kihara used extensive core hours on two Purdue supercomputers to achieve top results in the CASP and CAPRI protein modeling competitions. Kihara's team ran molecular dynamics simulations to compute the forces between every pair of atoms and the resulting movement of atoms and molecules. By simulating how a protein moves under different conditions, the researchers were able to test and refine their structure models. HPC was critical because of the sheer number of simulations that needed to be run. Learn more: https://bit.ly/33Ofa19.
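Why pairwise force calculations demand HPC can be seen in a minimal sketch (this is not Kihara's actual MD code; the Lennard-Jones potential, parameter values and atom counts are illustrative assumptions): every pair of atoms must be visited, so the cost grows with the square of the atom count.

```python
# Minimal sketch of a pairwise force computation (Lennard-Jones potential).
import numpy as np

def lj_forces(pos, epsilon=1.0, sigma=1.0):
    """Net Lennard-Jones force on each atom.

    pos: (N, 3) array of atom coordinates; returns an (N, 3) force array.
    The double loop visits N*(N-1)/2 pairs, which is why real simulations
    with many thousands of atoms and many time steps need a cluster.
    """
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[i] - pos[j]
            r2 = rij @ rij
            sr6 = (sigma * sigma / r2) ** 3
            # Force from U(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6),
            # written as a scalar factor multiplying the separation vector.
            f = 24 * epsilon * (2 * sr6 * sr6 - sr6) / r2 * rij
            forces[i] += f
            forces[j] -= f  # Newton's third law
    return forces

rng = np.random.default_rng(0)
atoms = rng.uniform(0.0, 10.0, size=(50, 3))
f = lj_forces(atoms)  # internal forces nearly cancel when summed
```

Integrating these forces over millions of tiny time steps, and repeating the whole simulation under many conditions, is what drives the core-hour totals mentioned above.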
Purdue’s Envision Center and chemistry professor Gaurav Chopra and his students collaborated to develop a virtual reality drug discovery game that allows a player – even one with little knowledge of chemistry – to potentially discover a new treatment for disease by manipulating drug compounds into the binding pockets of protein targets. Learn more: https://bit.ly/33OPCRn
Using Purdue’s powerful supercomputers, Purdue researchers Michael Rossmann and Richard Kuhn created the first detailed, 3-D structural map of the Zika virus, a key step towards developing treatments for the disease. The researchers used cryo-electron microscopy to capture images of many Zika virus particles, and then employed HPC to link similarly oriented views of the virus to each other and build a 3-D image of the virus structure at near-atomic resolution. Learn more: http://bit.ly/2hM1gpg.
Food science may not be a discipline traditionally associated with HPC, but Purdue professor Stephen Lindemann is using Purdue’s high-memory life sciences supercomputer and the latest sequence analysis and genome assembly techniques to make great strides in his research studying how diet affects the composition and function of the microbes in our gut. Lindemann has also taught genome annotation techniques to food science graduate students, many of whom had never used HPC before. Learn more: https://bit.ly/3511UWX.
HPC and advances in plant imaging have presented tremendous opportunities for data collection, but also data management challenges. The SmarterAg™ platform, provided by the Purdue Institute for Plant Sciences and supported by the Purdue-developed HUBzero cyberinfrastructure, gives plant science researchers a data management infrastructure that organizes data from many disparate streams for cohesive analysis, within a secure framework that supports collaboration, simulation and publication. Learn more: smarteragriculture.org.
MyGeoHub is a powerful, web-based platform that supports geospatial modeling, data analysis and visualization needs of research and education communities through the hosting of groups, data sets, tools, training materials and educational content. Built on the HUBzero open source software stack, MyGeoHub is currently home to four major projects: Global to Local Analysis of Systems Sustainability (GLASS), Useful to Usable (U2U), WaterHUB and Geospatial Data Analysis Building Blocks (GABBS). Learn more: mygeohub.org
Purdue Research Computing staff have successfully set up Windows virtual machines on Linux-based HPC clusters for research teams that need to use Windows-based software. One team, at Purdue’s Agronomy Center for Research and Education, uses drones to collect high-resolution imagery of two 15-acre fields, generating up to 10 GB of data per flight. Using a VM on a Purdue cluster reduced the processing time from 30 hours per data set to 2 hours, and more research time can now be allocated to data analysis and improving data collection techniques.
The Scientific Solutions Group partners with researchers on projects and proposals through developing and optimizing data-driven computational applications, data management solutions and science gateways. The group serves Purdue researchers as well as XSEDE users. Its major projects include DiaGrid, a multi-purpose science gateway built on Purdue’s HUBzero platform, and the NSF-funded Geospatial Data Analysis Building Blocks (GABBs) to support broad dissemination and sharing of geospatial data and tools. Learn more: diagrid.org, mygeohub.org/groups/gabbs.
Jian Jin, an assistant professor in Purdue’s Department of Agricultural and Biological Engineering, has built an innovative handheld sensor that gives plant scientists and farmers a more precise way of measuring the health of crops while gathering up-to-the-minute data that state and federal officials and others will find valuable. Users also have the option to upload the measurements with geo-locations to a web-based cloud map service developed by Purdue IT Research Computing scientist Carol Song. More info: https://bit.ly/2JXkTZ6
Jason Ackerson, an assistant professor of agronomy, uses techniques such as spectroscopy to study the properties of different soils, and generates soil maps that can be fed into simulations such as climate models or hydrologic models. To clean up his data and estimate the accuracy of the resulting maps, he needs to run computationally intensive models. That's where Data Workbench - an interactive computing environment that provides access to web-based data analysis tools - comes in. "Data Workbench has taken something that was a real chore, and it's now a trivial task," says Ackerson. "It's made the science a lot faster." Learn more: https://bit.ly/2CHrrsb.
Purdue University's community cluster program, which has included 18 major systems since its inception in 2004, has become the reference model for campus computing nationwide. Nearly 200 faculty partners and their students from Purdue's three campuses, all primary colleges and 59 departments, use these clusters for research in the sciences, engineering and social sciences. Learn more: http://bit.ly/2xzNjAQ.
Community storage resources offered by Purdue Research Computing range from solid-state scratch associated with the individual clusters to the Research Data Depot for working with and sharing active research data sets, as well as Fortress, a tape archive for long-term data storage. Almost 500 research laboratories have purchased over 2.5 PB of storage space in Purdue's Research Data Depot. Learn more: http://bit.ly/2xyYXvy.
Purdue professor Michael Manfra leads the effort at Purdue to build a robust and scalable quantum computer by producing what scientists call a “topological qubit.” Purdue is one of four international universities in the collaboration, which is part of Microsoft's global effort to build a quantum computer. More info: https://manfragroup.org/
Purdue’s Envision Center blends technology and art to assist researchers and teachers with data visualization and analysis; virtual simulation and environments for immersive training, research and engagement; human-computer interaction using tracking systems, gesture recognition and haptic devices; and media creation, including animations and publication-quality stills. The Envision Center uses technology such as the HTC Vive, Virtuix Omni and the Microsoft Hololens to develop simulations. Learn more: envision.purdue.edu
The Center for Brain-Inspired Computing (C-BRIC) is a five-year project supported by $27 million in funding from the Semiconductor Research Corp. The mission of C-BRIC is to deliver key advances in cognitive computing, with the goal of enabling a new generation of autonomous intelligent systems such as self-flying drones and interactive personal robots. By bringing together a unique team of leading researchers from the fields of machine learning, computational neuroscience, theoretical computer science, neuromorphic hardware, distributed computing, robotics and autonomous systems, C-BRIC will pursue quantum improvements in cognitive systems that will be difficult for these communities to achieve independently. More info: https://engineering.purdue.edu/C-BRIC
The Purdue Center for the Science of Information is a National Science Foundation Science and Technology Center, funded with nearly $50 million over 10 years. The center's focus is on advancing science and technology through a new quantitative understanding of the representation, communication, and processing of information in biological, physical, social, and engineered systems. It brings together accomplished researchers from diverse disciplines to develop a unique multidisciplinary perspective and to formulate solutions with significant broader impact. More info: https://bit.ly/2DzgSK6