San Diego Supercomputer Center
The San Diego Supercomputer Center (SDSC) is an organized research unit of the University of California, San Diego. Founded in 1985, it was one of the five original NSF supercomputing centers. Its research pursuits include high-performance computing, grid computing, computational biology, geoinformatics, computational physics, computational chemistry, data management, scientific visualization, cyberinfrastructure, and computer networking. SDSC's contributions to the computational biosciences and its computational approaches to earth science and genomics are internationally recognized. The current SDSC director is Frank Würthwein, Ph.D., a UC San Diego physics professor and a founding faculty member of the Halıcıoğlu Data Science Institute at UC San Diego. Würthwein assumed the role in July 2021, succeeding Michael L. Norman, also a UC San Diego physics professor, who had been SDSC director since September 2010.

Divisions and projects

SDSC's roles include creating and maintaining the Pro…
University of California, San Diego
The University of California, San Diego (UC San Diego in communications material, formerly and colloquially UCSD) is a public land-grant research university in San Diego, California, United States. Established in 1960 near the pre-existing Scripps Institution of Oceanography in La Jolla, UC San Diego is the southernmost of the ten campuses of the University of California. It offers over 200 undergraduate and graduate degree programs, enrolling 33,096 undergraduate and 9,872 graduate students, and has the second-largest student housing capacity in the nation. The university occupies a campus near the Pacific coast. UC San Diego consists of 12 undergraduate, graduate, and professional schools as well as 8 undergraduate residential colleges. The university operates 19 organized research units as well as 8 School of Medicine research units, 6 research centers at the Scripps Institution of Oceanography, and 2 multi-campus initiatives. UC San Diego is als…
TeraGrid
TeraGrid was an e-Science grid computing infrastructure combining resources at eleven partner sites. The project started in 2001 and operated from 2004 through 2011. TeraGrid integrated high-performance computers, data resources and tools, and experimental facilities. Resources included more than a petaflop of computing capability and more than 30 petabytes of online and archival data storage, with rapid access and retrieval over high-performance computer network connections. Researchers could also access more than 100 discipline-specific databases. TeraGrid was coordinated through the Grid Infrastructure Group (GIG) at the University of Chicago, working in partnership with the resource provider sites in the United States.

History

The US National Science Foundation (NSF), through program director Richard L. Hilderbrandt, issued a solicitation calling for a "distributed terascale facility". The TeraGrid project was launched in August 2001 with $53 million in funding to four sites:…
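To make the aggregate scale quoted above concrete, the short sketch below converts the two figures given in the text (roughly one petaflop of compute and 30 petabytes of storage) into base and more familiar units; those two totals are the only values taken from the article, and the rest is just SI arithmetic.

PETA = 10 ** 15

compute_flops = 1 * PETA      # "more than a petaflop" of aggregate compute
storage_bytes = 30 * PETA     # "more than 30 petabytes" of online/archival storage

# Express the same totals in smaller units for comparison.
print(f"Compute: {compute_flops:.1e} floating-point operations per second")
print(f"Storage: {storage_bytes / 10 ** 12:,.0f} terabytes")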
Cyberinfrastructure
United States federal government agencies use the term cyberinfrastructure to describe research environments that support advanced data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services distributed over the Internet beyond the scope of a single institution. In scientific usage, cyberinfrastructure is a technological and sociological solution to the problem of efficiently connecting federal laboratories, large amounts of data, processing power, and scientists, with the goal of enabling novel scientific discoveries and advances in human knowledge.

Origin

The term National Information Infrastructure had been popularized by Al Gore in the 1990s. This use of the term "cyberinfrastructure" evolved from the same thinking that produced Presidential Decision Directive NSC-63 on Protecting America's Critical Infrastructures (PDD-63). PDD-63 focuses on the security and vulnerability of the…
E-Science
E-Science or eScience is computationally intensive science that is carried out in highly distributed network environments, or science that uses immense data sets that require grid computing; the term sometimes includes technologies that enable distributed collaboration, such as the Access Grid. The term was coined in 1999 by John Taylor, the Director General of the United Kingdom's Office of Science and Technology, and was used to describe a large funding initiative starting in November 2000. E-science has since been interpreted more broadly, as "the application of computer technology to the undertaking of modern scientific investigation, including the preparation, experimentation, data collection, results dissemination, and long-term storage and accessibility of all materials generated through the scientific process. These may include data modeling and analysis, electronic/digitized laboratory notebooks, raw and fitted data sets, manuscript production and draft versions, pre…
Supercomputer Sites
A supercomputer is a type of computer with a high level of performance as compared to a general-purpose computer. The performance of a supercomputer is commonly measured in floating-point operations per second (FLOPS) rather than million instructions per second (MIPS). Since 2022, supercomputers have existed which can perform over 10^18 FLOPS, so-called exascale supercomputers. For comparison, a desktop computer has performance in the range of hundreds of gigaFLOPS (10^11) to tens of teraFLOPS (10^13). Since November 2017, all of the world's 500 fastest supercomputers (the TOP500) have run Linux-based operating systems. Additional research is being conducted in the United States, the European Union, Taiwan, Japan, and China to build faster, more powerful, and technologically superior exascale supercomputers. Supercomputers play an important role in the field of computational science and are used for a wide range of computationally intensive tasks in various fields,…
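As a minimal illustration of the FLOPS metric described above, the sketch below times a single dense matrix multiplication and divides the operation count (about 2*n^3 for an n x n product) by the elapsed time. It estimates the sustained throughput of one kernel on one machine; it is not a LINPACK-style TOP500 benchmark, and the problem size n is arbitrary.

import time
import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                      # dense matrix multiply: roughly 2*n**3 floating-point ops
elapsed = time.perf_counter() - start

flops = 2 * n ** 3 / elapsed
print(f"~{flops / 1e9:.1f} gigaFLOPS sustained ({elapsed:.3f} s at n = {n})")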
National Digital Information Infrastructure and Preservation Program
The National Digital Information Infrastructure and Preservation Program (NDIIPP) of the United States was an archival program led by the Library of Congress to preserve and provide access to digital resources. The program convened several working groups, administered grant projects, and disseminated information about digital preservation issues. The U.S. Congress appropriated funds to establish the program in 2000, and official activity specific to NDIIPP wound down between 2016 and 2018. The Library of Congress was chosen to lead the initiative because of its role as one of the leading providers of high-quality content on the Internet. The Library of Congress formed a national network of partners dedicated to preserving specific types of digital content that are at risk of loss. In July 2010, the Library of Congress launched the National Digital Stewardship Alliance (NDSA) to extend the work of NDIIPP to more partner institutions. The organization, which has been hosted by…
National Digital Library Program
The National Digital Library Program (NDLP) is a project by the United States Library of Congress to assemble a digital library of reproductions of primary source materials to support the study of the history and culture of the United States. The NDLP brought online 24 million books and documents from the Library of Congress and other research institutions.

History

Launched in 1995 after a five-year pilot project, the program began digitizing selected collections of Library of Congress archival materials that chronicle the nation's history. To reproduce collections of books, pamphlets, motion pictures, manuscripts, and sound recordings, the Library has created a range of digital entities: bitonal document images, grayscale and color pictorial images, digital video and audio, and searchable e-texts. To provide access to the reproductions, the project developed a range of descriptive elements: bibliographic records, finding aids, and introductory texts and pro…
RIPE Atlas
RIPE Atlas is a global, open, distributed Internet measurement platform consisting of thousands of measurement devices that measure Internet connectivity in real time.

History

RIPE Atlas was established in 2010 by the RIPE Network Coordination Centre. As of April 2022, it was composed of around 12,000 probes and more than 800 anchors around the world.

Technical details

* Measurement types: the measurement devices (probes and anchors) perform IPv4 and IPv6 traceroute, ping, DNS, NTP, and other measurements.
* Atlas probe device types:
** Versions 1 and 2 of the probe: Lantronix XPort Pro
** Version 3 probe: modified TP-Link wireless router (model TL-MR 3020)
** Version 4 probe: NanoPi NEO Plus2 single-board computer
** Version 5 probe: custom design, derived from Turris Mox, developed by CZ.NIC
* Atlas anchor device types:
** Version 2: Soekris Net6501-70 board in a 1U 19-inch rack-mounted case with an additional SSD
** Version 3: PC Engines APU2C2/APU2C4 in a 1U 19-inch rack-moun…
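Public results of the measurements listed above can be fetched over the platform's REST API. The sketch below is a minimal example assuming the v2 endpoint https://atlas.ripe.net/api/v2/measurements/<id>/results/ and a hypothetical measurement ID; the result fields shown (prb_id, rtt) are typical of ping measurements, but the current RIPE Atlas API documentation should be checked before relying on any of this.

import json
import urllib.request

MEASUREMENT_ID = 1001  # hypothetical public ping measurement ID

url = f"https://atlas.ripe.net/api/v2/measurements/{MEASUREMENT_ID}/results/"
with urllib.request.urlopen(url) as resp:
    results = json.load(resp)

# Print the reporting probe and its round-trip times for the first few results.
for entry in results[:5]:
    rtts = [p.get("rtt") for p in entry.get("result", []) if "rtt" in p]
    print(entry.get("prb_id"), rtts)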
PlanetLab
PlanetLab was a group of computers available as a testbed for computer networking and distributed systems research. It was established in 2002 by Prof. Larry L. Peterson and Prof. David Culler, and by 2005 it had been deployed at 252 sites in 28 countries. Each research project had a "slice", or virtual machine (VM) access to a subset of the nodes. Accounts were limited to persons affiliated with corporations and universities that hosted PlanetLab nodes. However, a number of free, public services were deployed on PlanetLab, including CoDeeN, the Coral Content Distribution Network, and Open DHT. PlanetLab was officially shut down in May 2020 but continues in Europe.
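The "slice" abstraction mentioned above (per-project virtual-machine access on a subset of the testbed's nodes) can be modelled with a small data structure. The sketch below is purely illustrative; the project name and hostnames are hypothetical, and it does not reflect PlanetLab's actual management software.

from dataclasses import dataclass, field

@dataclass
class Slice:
    """One research project's slice: VM access on a subset of nodes."""
    project: str
    nodes: set = field(default_factory=set)

    def add_node(self, hostname: str) -> None:
        # Grant this slice a virtual machine on one more testbed node.
        self.nodes.add(hostname)

testbed = ["planetlab1.example.edu", "planetlab2.example.org", "planetlab3.example.net"]

experiment = Slice("content-distribution-study")
for host in testbed[:2]:          # a slice typically spans only some of the nodes
    experiment.add_node(host)

print(experiment.project, "->", sorted(experiment.nodes))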
Storage Resource Broker
Storage Resource Broker (SRB) is data grid management computer software used in computational science research projects. SRB is a logical distributed file system based on a client-server architecture that presents users with a single global logical namespace or file hierarchy. Essentially, the software enables a user to use a single mechanism to work with multiple data sources.

Description

SRB provides a uniform interface to heterogeneous computer data storage resources over a network. As part of this, it implements a logical namespace (distinct from physical file names) and maintains metadata on data objects (files), users, groups, resources, collections, and other items in an SRB metadata catalog (MCAT) stored in a relational database management system. System- and user-defined metadata can be queried to locate files based on attributes as well as by name. SRB runs on various versions of Unix, Linux, and Microsoft Windows. The…
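To illustrate the two ideas just described, a logical namespace decoupled from physical file locations and an MCAT-style catalog queried by attribute, the sketch below models them with plain Python data structures. It is not the real SRB client API; every path, host, and attribute is hypothetical.

# Each entry maps a logical name onto a physical replica and carries metadata.
catalog = [
    {
        "logical_path": "/projects/climate/run42/output.nc",
        "physical_url": "unix://storage1.example.org/vol3/ab/91/output.nc",
        "metadata": {"owner": "alice", "experiment": "run42", "format": "netCDF"},
    },
    {
        "logical_path": "/projects/climate/run43/output.nc",
        "physical_url": "hpss://archive.example.org/tape/xy/12/output.nc",
        "metadata": {"owner": "bob", "experiment": "run43", "format": "netCDF"},
    },
]

def find_by_attribute(key, value):
    # Locate data objects by user-defined metadata rather than by name.
    return [obj["logical_path"] for obj in catalog if obj["metadata"].get(key) == value]

def resolve(logical_path):
    # Resolve a logical name to wherever the bytes physically live.
    return next(obj["physical_url"] for obj in catalog if obj["logical_path"] == logical_path)

print(find_by_attribute("experiment", "run42"))
print(resolve("/projects/climate/run42/output.nc"))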
Rocks Cluster Distribution
Rocks Cluster Distribution (originally NPACI Rocks) is a Linux distribution intended for high-performance computing (HPC) clusters. It was started by the National Partnership for Advanced Computational Infrastructure and the San Diego Supercomputer Center (SDSC) in 2000. It was initially funded in part by an NSF grant (2000–07) and was then funded by a follow-up NSF grant through 2011.

Distribution

Rocks was initially based on the Red Hat Linux (RHL) distribution; modern versions of Rocks were based on CentOS, with a modified Anaconda installer that simplifies mass installation onto many computers. Rocks includes many tools (such as the Message Passing Interface (MPI)) which are not part of CentOS but are integral components that turn a group of computers into a cluster. Installations can be customized with additional software packages at install time by using special user-supplied CDs (called "Roll CDs"). The "Rolls" extend the system by integrating seamlessly and auto…
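Since MPI is named above as one of the components that turns the installed machines into a cluster, a conventional smoke test is a cluster-wide MPI "hello world". The sketch below uses the mpi4py binding purely as an example; it is not necessarily the MPI package a given Rocks roll installs, and the launch command may differ by site.

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()            # this process's index within the MPI job
size = comm.Get_size()            # total number of processes across the nodes
node = MPI.Get_processor_name()   # hostname of the node running this rank

print(f"Hello from rank {rank} of {size} on {node}")

Launched with, for example, mpirun -np 4 python hello_mpi.py, each rank prints its index and host, confirming that processes were actually scheduled across the cluster's nodes.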
Argonne National Laboratory
Argonne National Laboratory is a federally funded research and development center in Lemont, Illinois, United States. Founded in 1946, the laboratory is owned by the United States Department of Energy and administered by UChicago Argonne LLC of the University of Chicago. The facility is the largest national laboratory in the Midwestern United States. Argonne had its beginnings in the Metallurgical Laboratory of the University of Chicago, formed in part to carry out Enrico Fermi's work on nuclear reactors for the Manhattan Project during World War II. After the war, it was designated as the first national laboratory in the United States on July 1, 1946. In its first decades, the laboratory was a hub for the peaceful use of nuclear physics; nearly all operating commercial nuclear power plants around the world have roots in Argonne research. More than 1,000 scientists conduct research at the laboratory, in the…