Ontology For Biomedical Investigations
The Ontology for Biomedical Investigations (OBI) is an open-access, integrated ontology for the description of biological and clinical investigations. OBI provides a model for the design of an investigation, the protocols and instrumentation used, the materials used, the data generated, and the types of analysis performed on it. The project is being developed as part of the OBO Foundry and as such adheres to the Foundry's principles, such as orthogonal coverage (i.e., clear delineation from other Foundry member ontologies) and the use of a common formal language. In OBI the common formal language used is the Web Ontology Language (OWL). As of March 2008, a pre-release version of the ontology is available at the project's SVN repository.

Scope

The Ontology for Biomedical Investigations (OBI) addresses the need for controlled vocabularies to support integration and joint ("cross-omics") analysis of experimental data, a need originally identified in the transcriptomics ...
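Since OBI is published as an OWL file, it can be inspected with standard RDF tooling. The following is a minimal sketch, assuming the Python rdflib package and a local copy of the ontology saved as obi.owl (a hypothetical path); it lists each named class together with its human-readable label.

```python
# Sketch only: assumes rdflib is installed and "obi.owl" is a local
# copy of the OBI ontology in RDF/XML (the path is illustrative).
from rdflib import Graph, RDF, RDFS, OWL

g = Graph()
g.parse("obi.owl", format="xml")  # OBI is distributed in OWL

# Enumerate every named class and print its rdfs:label.
for cls in g.subjects(RDF.type, OWL.Class):
    for label in g.objects(cls, RDFS.label):
        print(cls, label)
```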
Open Access (publishing)
Open access (OA) is a set of principles and a range of practices through which nominally copyrightable publications are delivered to readers free of access charges or other barriers. With open access strictly defined (according to the 2001 definition), or libre open access, barriers to copying or reuse are also reduced or removed by applying an open license for copyright, which regulates post-publication uses of the work. The main focus of the open access movement has been on "peer reviewed research literature", and more specifically on academic journals. This is because:
* such publications have been the subject of the serials crisis, unlike newspapers, magazines, and fiction writing. The main difference between these two groups is in demand elasticity: whereas an English literature curriculum can substitute Harry Potter and the Philosopher's Stone with a public-domain alternative, such as A Voyage to Lilliput, an emergency room physician treating a patient for a life-threatening ...
Electronic Medical Record
An electronic health record (EHR) is the systematized collection of electronically stored patient and population health information in a digital format. These records can be shared across different health care settings; they are shared through network-connected, enterprise-wide information systems or other information networks and exchanges. EHRs may include a range of data, including demographics, medical history, medications and allergies, immunization status, laboratory test results, radiology images, vital signs, personal statistics such as age and weight, and billing information. For several decades, EHRs have been touted as key to increasing quality of care. EHR systems combine all patients' demographics into a large pool, which assists providers in the creation of "new treatments or innovation in healthcare delivery" to improve quality outcomes in healthcare. Combining multiple types of clinical data from the system's health records has helped clinicians identify and stratify chronically ill patients ...
Semantic Web
The Semantic Web, sometimes known as Web 3.0, is an extension of the World Wide Web through standards set by the World Wide Web Consortium (W3C). The goal of the Semantic Web is to make Internet data machine-readable. To enable the encoding of semantics with the data, technologies such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL) are used. These technologies are used to formally represent metadata. For example, an ontology can describe concepts, relationships between entities, and categories of things. These embedded semantics offer significant advantages such as reasoning over data and operating with heterogeneous data sources. The standards promote common data formats and exchange protocols on the Web, most fundamentally RDF. According to the W3C, "The Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries."
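The RDF data model mentioned above represents everything as subject-predicate-object triples. A minimal sketch with the Python rdflib package (the example.org namespace and resource names are invented for illustration):

```python
# Sketch of the RDF triple model; the namespace and resources are
# hypothetical, chosen only to echo the OBI entry above.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/")  # invented namespace
g = Graph()

# Each statement is a (subject, predicate, object) triple.
g.add((EX.Ontology, RDF.type, RDFS.Class))
g.add((EX.obi, RDF.type, EX.Ontology))
g.add((EX.obi, RDFS.label, Literal("Ontology for Biomedical Investigations")))

# Serialize in Turtle, one common RDF exchange format.
print(g.serialize(format="turtle"))
```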
Technical Communication
Technical communication (or tech comm) is communication of technical subject matter such as engineering, science, or technology content. The largest part of it tends to be technical writing, though it often also requires aspects of visual communication (which in turn sometimes entails technical drawing, requiring more specialized training). Technical communication also encompasses oral delivery modes such as presentations involving technical material. When technical communication occurs in workplace settings, it is considered a major branch of professional communication. In research or R&D contexts (academic or industrial), it can overlap with scientific writing. Technical communication is used to convey scientific, engineering, or other technical information. Individuals in a variety of contexts and with varied professional credentials engage in technical communication. Some individuals are designated as technical communicators or technical writers as their primary ...
Knowledge Engineering
Knowledge engineering (KE) refers to all aspects involved in building, maintaining, and using knowledge-based systems.

Background

Expert systems

One of the first examples of an expert system was MYCIN, an application that performed medical diagnosis. In the MYCIN example, the domain experts were medical doctors and the knowledge represented was their expertise in diagnosis. Expert systems were first developed in artificial intelligence laboratories as an attempt to understand complex human decision making. Based on positive results from these initial prototypes, the technology was adopted by the US business community (and later worldwide) in the 1980s. The Stanford heuristic programming project, led by Edward Feigenbaum, was one of the leaders in defining and developing the first expert systems.

History

In the earliest days of expert systems, there was little or no formal process for the creation of the software. Researchers simply sat down with domain experts and started programming, often developing the required ...
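Systems like MYCIN encoded expert knowledge as if-then rules and derived conclusions by chaining them. The toy Python sketch below shows only the forward-chaining idea; the facts and rules are invented, and real systems such as MYCIN additionally used certainty factors and much larger, expert-elicited knowledge bases.

```python
# Toy forward-chaining rule engine in the spirit of early expert
# systems. Rules and facts are invented for illustration.
rules = [
    ({"fever", "stiff_neck"}, "suspect_meningitis"),
    ({"suspect_meningitis", "gram_negative"}, "suspect_e_coli"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions all hold, until no
    new conclusion can be added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"fever", "stiff_neck", "gram_negative"}, rules))
# Output includes both derived facts: 'suspect_meningitis', 'suspect_e_coli'
```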
Metabolomics
Metabolomics is the scientific study of chemical processes involving metabolites, the small-molecule substrates, intermediates, and products of cell metabolism. Specifically, metabolomics is the "systematic study of the unique chemical fingerprints that specific cellular processes leave behind", the study of their small-molecule metabolite profiles. The metabolome represents the complete set of metabolites in a biological cell, tissue, organ, or organism, which are the end products of cellular processes. Messenger RNA (mRNA) gene expression data and proteomic analyses reveal the set of gene products being produced in the cell, data that represents one aspect of cellular function. Conversely, metabolic profiling can give an instantaneous snapshot of the physiology of that cell, and thus metabolomics provides a direct "functional readout of the physiological state" of an organism. There are indeed quantifiable correlations between the metabolome and the other cellular ...
Proteomics
Proteomics is the large-scale study of proteins. Proteins are vital macromolecules of all living organisms, with many functions such as the formation of structural fibers of muscle tissue, enzymatic digestion of food, or synthesis and replication of DNA. Other kinds of proteins include antibodies that protect an organism from infection, and hormones that send important signals throughout the body. The proteome is the entire set of proteins produced or modified by an organism or system; it varies with time and with the distinct requirements, or stresses, that a cell or organism undergoes. Proteomics enables the identification of ever-increasing numbers of proteins. Proteomics is an interdisciplinary domain that has benefited greatly from the genetic information of various genome projects, including the Human Genome Project. It covers the exploration of proteomes from the overall level of protein composition, structure, and activity, and is an important component of functional genomics.
Transcriptomics
Transcriptomics technologies are the techniques used to study an organism's transcriptome, the sum of all of its RNA transcripts. The information content of an organism is recorded in the DNA of its genome and expressed through transcription. Here, mRNA serves as a transient intermediary molecule in the information network, whilst non-coding RNAs perform additional diverse functions. A transcriptome captures a snapshot in time of the total transcripts present in a cell. Transcriptomics technologies provide a broad account of which cellular processes are active and which are dormant. A major challenge in molecular biology is to understand how a single genome gives rise to a variety of cells. Another is how gene expression is regulated. The first attempts to study whole transcriptomes began in the early 1990s. Subsequent technological advances since the late 1990s have repeatedly transformed the field and made transcriptomics ...
Principal Components Analysis
Principal component analysis (PCA) is a linear dimensionality reduction technique with applications in exploratory data analysis, visualization, and data preprocessing. The data is linearly transformed onto a new coordinate system such that the directions (principal components) capturing the largest variation in the data can be easily identified. The principal components of a collection of points in a real coordinate space are a sequence of p unit vectors, where the i-th vector is the direction of a line that best fits the data while being orthogonal to the first i-1 vectors. Here, a best-fitting line is defined as one that minimizes the average squared perpendicular distance from the points to the line. These directions (i.e., principal components) constitute an orthonormal basis in which different individual dimensions of the data are linearly uncorrelated.
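A minimal NumPy sketch of the procedure just described: center the data, obtain the orthonormal principal directions from the singular value decomposition, and project onto the leading components (the synthetic data is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))      # 200 samples, 5 features (synthetic)

X_centered = X - X.mean(axis=0)    # PCA is defined on mean-centered data
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

components = Vt                    # rows: orthonormal principal directions
explained_variance = S**2 / (len(X) - 1)   # variance along each direction

# Project the data onto the first two principal components.
X_reduced = X_centered @ components[:2].T
print(explained_variance)
```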
Data Normalization
In mathematics and computer science, a canonical, normal, or standard form of a mathematical object is a standard way of presenting that object as a mathematical expression. Often, it is one which provides the simplest representation of an object and allows it to be identified in a unique way. The distinction between "canonical" and "normal" forms varies from subfield to subfield. In most fields, a canonical form specifies a unique representation for every object, while a normal form simply specifies its form, without the requirement of uniqueness. The canonical form of a positive integer in decimal representation is a finite sequence of digits that does not begin with zero. More generally, for a class of objects on which an equivalence relation is defined, a canonical form consists of the choice of a specific object in each class. For example:
* Jordan normal form is a canonical form for matrix similarity.
* The row echelon form is a canonical form, when one considers as equivalent ...
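As a small illustration of the row echelon example, SymPy's rref() computes the reduced row echelon form, which serves as a canonical representative of the class of matrices related by left-multiplication by an invertible matrix (the matrix below is an arbitrary example):

```python
# Sketch: SymPy's rref() returns the reduced row echelon form plus the
# indices of the pivot columns.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [3, 6, 10]])

rref_form, pivot_cols = A.rref()
print(rref_form)   # canonical (reduced row echelon) form of A
print(pivot_cols)  # tuple of pivot column indices
```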
Electrophoresis
Electrophoresis is the motion of charged dispersed particles or dissolved charged molecules relative to a fluid under the influence of a spatially uniform electric field. As a rule, these are zwitterions with a positive or negative net charge. Electrophoresis is used in laboratories to separate macromolecules based on their charges. The technique normally uses a negatively charged electrode (the cathode) and a positively charged electrode (the anode), so that anionic molecules such as many proteins migrate towards the anode. Electrophoresis of positively charged particles or molecules (cations) is sometimes called cataphoresis, while electrophoresis of negatively charged particles or molecules (anions) is sometimes called anaphoresis. Electrophoresis is the basis for analytical techniques used in biochemistry and molecular biology to separate particles, molecules, or ions by size, charge, or binding affinity, either freely or through a supportive medium, using a one-directional flow of electrical charge. It is used ...
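For a rough sense of the scale involved: in a spatially uniform field, a particle with electrophoretic mobility mu drifts at velocity v = mu * E. A back-of-the-envelope Python sketch with assumed, purely illustrative values:

```python
# Back-of-the-envelope sketch of drift in a uniform field: v = mu * E.
# Both numeric values below are assumed for illustration, not measured.
mobility = 2.0e-8   # electrophoretic mobility, m^2 / (V * s) (assumed)
field = 1.0e3       # electric field strength, V / m (assumed)

velocity = mobility * field  # drift velocity, m / s
print(f"drift velocity: {velocity:.2e} m/s")  # -> 2.00e-05 m/s
```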