Free license for normal use

SketchBio


SketchBio aims to provide a rapid-to-use, easy-to-learn 3D modeling tool that lets biologists interactively explore “what if” scenarios for subcellular structures.

It includes three novel features that accelerate operations common in building molecular models: crystal by example, pose-mode physics, and spring-based layout.
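As a rough illustration of the spring-based layout idea, the sketch below relaxes subunit positions so that user-specified springs approach their rest lengths; the function name and data layout are hypothetical, not SketchBio's actual API.

import numpy as np

def relax_springs(positions, springs, stiffness=0.1, iterations=200):
    """Move subunits so each spring approaches its rest length.

    positions : (N, 3) array of subunit centers
    springs   : list of (i, j, rest_length) tuples
    """
    pos = positions.astype(float).copy()
    for _ in range(iterations):
        forces = np.zeros_like(pos)
        for i, j, rest in springs:
            delta = pos[j] - pos[i]
            dist = np.linalg.norm(delta)
            if dist < 1e-9:
                continue
            # Hooke's law: pull (or push) both endpoints along the connecting axis.
            f = stiffness * (dist - rest) * delta / dist
            forces[i] += f
            forces[j] -= f
        pos += forces
    return pos

# Example: pull two bound subunits toward a 5-unit separation while a third stays attached.
start = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [5.0, 8.0, 0.0]])
print(relax_springs(start, [(0, 1, 5.0), (1, 2, 5.0)]))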


Project development

Institution: University of North Carolina

This work was supported by the NIH 5-P41-EB002025. Molecular graphics and analyses were performed with the UCSF Chimera package. Chimera is developed by the Resource for Biocomputing, Visualization and Informatics at the University of California, San Francisco (supported by NIGMS P41-GM103311). 3D rendering was performed using Blender (blender.org). Blender is an open-source project supported by the Blender Foundation and the online community. Early versions of SketchBio used PyMOL (pymol.org) to import data from the PDB. PyMOL is an open-source project maintained and distributed by Schrödinger.

RuleBender

RuleBender is a novel visualization system for the integrated visualization, modeling, and simulation of rule-based intracellular biochemistry. Rule-based modeling (RBM) is a powerful and increasingly popular approach to modeling cell signaling networks. However, novel visual tools are needed to make RBM accessible to a broad range of users, to make model specification less error prone, and to improve workflows. Support for RBM creation, debugging, and interactive visualization expedites the RBM learning process and reduces model construction time, while built-in simulation and results presented in multiple linked views streamline the execution and analysis of newly created models and generated networks.
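To illustrate what rule-based modeling means in practice, the toy Python sketch below generates a small reaction network by repeatedly applying rules to a set of seed species; the species names and rules are invented for illustration and are not part of RuleBender or the BioNetGen language it edits.

def apply_rules(seed_species, rules, max_rounds=10):
    """Expand a reaction network by repeatedly applying rules to known species."""
    species = set(seed_species)
    reactions = []
    for _ in range(max_rounds):
        new_species = set()
        for rule in rules:
            for reactants, products in rule(species):
                if (reactants, products) not in reactions:
                    reactions.append((reactants, products))
                new_species |= set(products) - species
        if not new_species:
            break
        species |= new_species
    return species, reactions

# Rule 1: a free ligand L binds a free receptor R, yielding a complex.
def bind_ligand_receptor(species):
    if "L" in species and "R" in species:
        yield (("L", "R"), ("L.R",))

# Rule 2: the complex can be phosphorylated.
def phosphorylate(species):
    if "L.R" in species:
        yield (("L.R",), ("L.R~P",))

species, reactions = apply_rules({"L", "R"}, [bind_ligand_receptor, phosphorylate])
print(sorted(species))   # ['L', 'L.R', 'L.R~P', 'R']
print(reactions)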


Project development

Institution: University of Pittsburgh

Work supported by NSF-IIS-0952720, NSF-CCF-0829788, NIH-GM-076570, and NIH-UL1-RR024153. We thank the Pitt Visualization Lab, the Faeder Lab, and the Emonet Lab for their helpful feedback, and the reviewers for their suggestions of exciting future work.

VIPER

VIPER (Visual Pedigree Explorer) is a tool for exploring large complex animal pedigrees and their associated genotype data. The tool combines a novel, space-efficient visualisation of the pedigree structure with an inheritance-checking algorithm. This allows users to explore the apparent errors within the genotype data in the full context of the family and pedigree structure. Ultimately, the aim is to develop an interactive software application that will allow users to identify, confirm and then remove errors from the pedigree structure and scored genotypes. 
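The sketch below shows the kind of Mendelian inheritance check that underlies this error detection: a genotype is flagged when it cannot be formed from one allele of each recorded parent. The data structures are illustrative and do not reflect VIPER's actual formats.

def mendelian_errors(pedigree, genotypes):
    """Return (individual, marker) pairs whose genotype cannot be explained
    by one allele from each recorded parent.

    pedigree  : dict individual -> (sire, dam); a parent may be None
    genotypes : dict individual -> dict marker -> (allele, allele)
    """
    errors = []
    for child, (sire, dam) in pedigree.items():
        for marker, (a1, a2) in genotypes.get(child, {}).items():
            sire_alleles = genotypes.get(sire, {}).get(marker)
            dam_alleles = genotypes.get(dam, {}).get(marker)
            # If a parent is untyped or unknown, any allele could have come from them.
            sire_alleles = sire_alleles or (a1, a2)
            dam_alleles = dam_alleles or (a1, a2)
            ok = any(x in sire_alleles and y in dam_alleles
                     for x, y in ((a1, a2), (a2, a1)))
            if not ok:
                errors.append((child, marker))
    return errors

pedigree = {"calf1": ("bull1", "cow1")}
genotypes = {
    "bull1": {"m1": ("A", "A")},
    "cow1":  {"m1": ("B", "B")},
    "calf1": {"m1": ("A", "A")},   # impossible: the calf must carry one B from the dam
}
print(mendelian_errors(pedigree, genotypes))   # [('calf1', 'm1')]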


Project development

Institution: The Roslin Institute, University of Edinburgh; School of Computing, Edinburgh Napier University

LayerCake

LayerCake is a tool designed to assist in exploring the genetic variability of virus populations at multiple time points and in multiple individuals, a task that requires considering large amounts of sequence data along with the quality issues inherent in obtaining such data practically. The design supports examining the amount of variability and mutation at each position in the genome across many virus populations. It incorporates novel visualization techniques that support this specific class of analysis while addressing the data aggregation, confidence visualization, and interaction issues that arise when working with large amounts of sequence data of variable uncertainty. These techniques generalize to a wide class of visualization problems where confidence is not known a priori and aggregation in multiple directions is necessary.
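As a simplified illustration of the aggregation-with-confidence idea, the sketch below collapses per-position variant frequencies into genome bins and flags bins whose coverage is too low to trust; the array layout, bin size, and coverage threshold are assumptions made for the example, not LayerCake's actual pipeline.

import numpy as np

def aggregate_variability(base_counts, bin_size=50, min_coverage=20):
    """Collapse per-position variant frequencies into genome bins.

    base_counts : (L, 4) array of A/C/G/T counts per reference position
    Returns the mean non-consensus frequency per bin and a confidence flag
    per bin (False where mean coverage falls below min_coverage).
    """
    coverage = base_counts.sum(axis=1)
    consensus = base_counts.max(axis=1)
    with np.errstate(divide="ignore", invalid="ignore"):
        variability = np.where(coverage > 0, 1.0 - consensus / coverage, 0.0)

    n_bins = int(np.ceil(len(base_counts) / bin_size))
    var_bins, conf_bins = [], []
    for b in range(n_bins):
        sl = slice(b * bin_size, (b + 1) * bin_size)
        var_bins.append(variability[sl].mean())
        conf_bins.append(coverage[sl].mean() >= min_coverage)
    return np.array(var_bins), np.array(conf_bins)

# 300 positions of synthetic counts: mostly consensus, one variable stretch.
counts = np.tile([30, 0, 0, 0], (300, 1))
counts[120:140] = [15, 15, 0, 0]          # 50% non-consensus frequency
counts[250:300] = [3, 0, 0, 0]            # low coverage -> low confidence
var, conf = aggregate_variability(counts)
print(var.round(2))
print(conf)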


Project development

Institution: University of Wisconsin, Madison

This work was supported by NSF awards IIS-0946598 and CMMI-0941013. Related virology research was supported by NIH R01 AI084787.

OpenWalnut

OpenWalnut implements a novel and effective method for visualizing probabilistic tractograms within their anatomical context. The illustrative rendering technique, called fiber stippling, is inspired by visualization standards found in anatomical textbooks, whose illustrations typically show slice-based projections of fiber pathways and are usually hand-drawn. Applying the automated technique to diffusion tractography demonstrates its expressiveness and intuitive usability, as well as a more objective way to present white-matter structure in the human brain.
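A minimal sketch of the stippling idea on synthetic data is shown below: stipple density follows the tract probability on a slice, and each stipple is drawn as a short dash aligned with the local in-plane fiber direction. This is a simplified stand-in, not OpenWalnut's implementation.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
h, w = 128, 128
ys, _ = np.mgrid[0:h, 0:w]
prob = np.exp(-((ys - h / 2) ** 2) / (2 * 12.0 ** 2))   # a horizontal "tract"
angle = np.zeros((h, w))                                 # fibers run along x

# Rejection-sample stipple positions so local density tracks the probability.
cx = rng.uniform(0, w - 1, 8000)
cy = rng.uniform(0, h - 1, 8000)
keep = rng.uniform(0, 1, 8000) < prob[cy.astype(int), cx.astype(int)]
px, py = cx[keep], cy[keep]

# Draw each stipple as a short segment along the local fiber direction.
a = angle[py.astype(int), px.astype(int)]
dx, dy = np.cos(a) * 1.5, np.sin(a) * 1.5
plt.figure(figsize=(4, 4))
plt.plot([px - dx, px + dx], [py - dy, py + dy], color="k", linewidth=0.5)
plt.gca().set_aspect("equal")
plt.gca().invert_yaxis()
plt.title("Fiber stippling sketch (synthetic data)")
plt.show()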


Project development

Institution: Zuse Institute Berlin, Max Planck Institute for Neurological Research Cologne, University of Leipzig

This work was supported by the German Federal Ministry of Education and Research as part of the VisPME research collaboration (01IH08009F) as well as by the AiF (ZIM grant KF 2034701SS8).

Physioillustration: Interactive molecular illustration


Physioillustration focuses on the illustrative visualization of physiological processes at the molecular scale, providing intuitive visual representations that the user can observe and interact with.


Project development

Institution: University of Bergen

The focus of the Physioillustration project is to develop novel graphics data representations, visual representations, occlusion handling, visual guidance and storytelling, zooming, interaction, and integration of physiological models and medical imaging. The visualization technology will be developed and evaluated on multiple scale levels, from molecular machines up to the organ level. This work has been carried out within the PhysioIllustration research project (# 218023), which is funded by the Norwegian Research Council. Additionally, we would like to thank Helwig Hauser and the Visualization group in Bergen for useful ideas and feedback.

FluoRender

FluoRender is an interactive tool for neurobiologists to visualize confocal microscopy data in their research. Multiple channels, detailed three-dimensional structures, and time-dependent sequences are the three major features of confocal microscopy data. With these features and usability in mind, we designed and engineered our system, which is now a free package available for public download. We present the visualization pipeline and main features of our system for 3D/4D multi-channel confocal data visualization. The system supports the input formats commonly seen for confocal microscopy, and by minimizing pre-processing and optimizing the data-reading code, it can read 3D/4D data with minimal latency. It has easy-to-use parameters for volume rendering effects, which can be adjusted at real-time speed. It uses several image post-processing methods for detail enhancement; because these are applied after the volumetric data are rendered, their adjustment remains real-time even for 4D sequences. For multi-channel data, the system supports three different blending modes and channel grouping, so users can easily change all the settings and emphasize the most important features.
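The sketch below illustrates generic multi-channel blending, assuming each channel has already been volume-rendered to a premultiplied RGBA image; the three modes shown (maximum, additive, over-compositing) are common choices for this kind of merging and are not necessarily FluoRender's own blending modes.

import numpy as np

def blend_channels(channel_images, mode="composite"):
    """Blend a list of (H, W, 4) premultiplied RGBA renderings into one image."""
    out = np.zeros_like(channel_images[0], dtype=float)
    if mode == "max":                      # brightest channel value wins per pixel
        for img in channel_images:
            out = np.maximum(out, img)
    elif mode == "additive":               # sum the channels and clamp
        for img in channel_images:
            out = np.clip(out + img, 0.0, 1.0)
    elif mode == "composite":              # back-to-front "over" compositing
        for img in channel_images:
            alpha = img[..., 3:4]
            out = img + out * (1.0 - alpha)
        out = np.clip(out, 0.0, 1.0)
    return out

# Two synthetic channels: a red one and a dimmer green one.
red = np.zeros((64, 64, 4)); red[..., 0] = red[..., 3] = 0.6
green = np.zeros((64, 64, 4)); green[..., 1] = green[..., 3] = 0.4
blended = blend_channels([red, green], mode="additive")
print(blended[0, 0])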


Project development

Institution: SCI Institute and the School of Computing, Department of Neurobiology and Anatomy, University of Utah

This publication is based on work supported by Award No. KUS-CI-016-04, made by King Abdullah University of Science and Technology (KAUST), DOE SciDAC:VACET, NSF OCI-0906379, NIH-1R01GM098151-01.

GVF Alignment Viz

By applying gradient vector flow analysis to MSA data, it is possible to extract and visually emphasize conserved regions and other patterns that are relevant during the MSA exploration process. In contrast to the traditional visual representation of MSAs, which relies on color-coded tables, the proposed visual metaphor provides an overview of large MSAs and highlights global patterns, outliers, and data distributions.
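As a simplified stand-in for the pattern-extraction step, the sketch below computes a per-column conservation score for an MSA using Shannon entropy; the actual technique operates on a gradient vector flow field derived from an image-like encoding of the alignment, so this only shows the kind of per-column signal such an overview summarizes.

import math
from collections import Counter

def column_conservation(msa):
    """Return one conservation score in [0, 1] per alignment column."""
    n_cols = len(msa[0])
    scores = []
    for c in range(n_cols):
        column = [seq[c] for seq in msa if seq[c] != "-"]   # ignore gaps
        counts = Counter(column)
        total = sum(counts.values())
        entropy = -sum((k / total) * math.log2(k / total) for k in counts.values())
        max_entropy = math.log2(len(counts)) if len(counts) > 1 else 1.0
        scores.append(1.0 - entropy / max_entropy if total else 0.0)
    return scores

# Fully conserved columns score 1.0; mixed columns score lower.
msa = ["ACGTA", "ACGTC", "ACGAA", "ACG-A"]
print([round(s, 2) for s in column_conservation(msa)])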


Project development

Institution: Linköping University, Sweden

This work was supported through grants from the Excellence Center at Linkoping and Lund in Information Technology (ELLIIT), the Swedish Research Council (VR, grant 2011–4113), and the Swedish e-Science Research Centre (SeRC). The presented technique has been integrated into the Voreen volume rendering engine (www.voreen.org), which is an open source visualization framework.
