NetCDF

Network Common Data Form
Filename extensions: .nc, .cdf
Internet media type: application/netcdf
Magic number: CDF\001
Developed by: UCAR
Type of format: scientific binary data
Extended from: CDF

NetCDF (Network Common Data Form) is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. The project homepage is hosted by the Unidata program at the University Corporation for Atmospheric Research (UCAR), which is also the chief source of netCDF software, standards development, and updates. The format is an open standard; the netCDF Classic and 64-bit Offset formats are an international standard of the Open Geospatial Consortium.[1]

The project started in 1989 and is still actively supported by UCAR. Version 3.x (released in 1997) is still widely used and maintained by UCAR (last updated in 2012). Version 4.0 (released in 2008) allows the use of the HDF5 data file format, and version 4.1 (2010) adds support for C and Fortran client access to specified subsets of remote data via OPeNDAP. UCAR plans to continue updating both version 3 and version 4.

The format was originally based on the conceptual model of the Common Data Format developed by NASA, but has since diverged and is not compatible with it.[2]

Format description

The netCDF libraries support three binary formats for netCDF files:

  • The classic format was used in the first netCDF release, and is still the default format for file creation.
  • The 64-bit offset format was introduced in version 3.6.0, and it supports larger variable and file sizes.
  • The netCDF-4/HDF5 format was introduced in version 4.0; it is the HDF5 data format, with some restrictions.
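The three variants can be told apart by their opening bytes: classic files begin with the ASCII characters CDF followed by a version byte of 1, 64-bit offset files use a version byte of 2, and netCDF-4 files carry the standard HDF5 file signature. A minimal sniffing sketch in Python (the function names here are illustrative, not part of any netCDF API):

```python
import struct

# Signatures of the three netCDF on-disk variants.
CLASSIC = b"CDF\x01"             # classic format (version byte 1)
OFFSET64 = b"CDF\x02"            # 64-bit offset format (version byte 2)
HDF5_SIG = b"\x89HDF\r\n\x1a\n"  # netCDF-4 files are HDF5 files

def sniff_netcdf(header: bytes) -> str:
    """Guess the netCDF variant from the first few bytes of a file."""
    if header.startswith(CLASSIC):
        return "classic"
    if header.startswith(OFFSET64):
        return "64-bit offset"
    if header.startswith(HDF5_SIG):
        return "netCDF-4/HDF5"
    return "not a netCDF file"

def classic_numrecs(header: bytes) -> int:
    """In classic and 64-bit offset files, the magic is immediately
    followed by the record count as a big-endian 32-bit integer
    (the byte order is fixed by the format, not by the writing host)."""
    (numrecs,) = struct.unpack(">i", header[4:8])
    return numrecs
```

For example, `sniff_netcdf(b"CDF\x01" + b"\x00" * 4)` returns `"classic"`.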

All formats are "self-describing". This means that there is a header which describes the layout of the rest of the file, in particular the data arrays, as well as arbitrary file metadata in the form of name/value attributes. The format is platform independent, with issues such as endianness being addressed in the software libraries. The data are stored in a fashion that allows efficient subsetting.
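The self-describing layout is easiest to see in CDL, the text notation that the ncdump utility produces from a binary netCDF file. A hypothetical file holding a small temperature grid might dump as:

```
netcdf example {
dimensions:
        lat = 2 ;
        time = UNLIMITED ; // (2 currently)
variables:
        float temp(time, lat) ;
                temp:units = "K" ;
                temp:long_name = "surface temperature" ;

// global attributes:
                :title = "toy example" ;
data:

 temp =
  280.5, 281.0,
  279.9, 280.4 ;
}
```

The header (dimensions, variables, and attributes) precedes the data section, so everything needed to interpret the temp array travels inside the file itself.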

Starting with version 4.0, the netCDF API[3] allows the use of the HDF5 data format. NetCDF users can create HDF5 files with benefits not available with the netCDF format, such as much larger files and multiple unlimited dimensions.

Full backward compatibility in accessing old netCDF files and using previous versions of the C and Fortran APIs is supported.


Access libraries

The software libraries supplied by UCAR provide read-write access to netCDF files, encoding and decoding the necessary arrays and metadata. The core library is written in C and provides APIs for C, C++, and Fortran applications (one API for Fortran 77 and one for Fortran 90). An independent implementation, also developed and maintained by Unidata, is written entirely in Java; it extends the core data model and adds further functionality. Interfaces to netCDF based on the C library are also available in other languages, including R (the ncdf,[4] ncvar, and RNetCDF[5] packages), Perl, Python, Ruby, Haskell,[6] Mathematica, MATLAB, IDL, and Octave. The API calls are very similar across the different languages, apart from inevitable differences of syntax. The API calls in version 2 were rather different from those in version 3, but versions 3 and 4 still support them for backward compatibility. Application programmers using supported languages normally need not concern themselves with the file structure itself, even though the formats are open and documented.


A wide range of application software has been written that makes use of netCDF files, ranging from command line utilities to graphical visualization packages. A number are listed below, and a longer list[7] is available on the UCAR website.

  • A commonly used set of Unix command line utilities for netCDF files is the NetCDF Operators (NCO) suite, which provides commands for manipulating and analyzing netCDF files, including record concatenation, slicing, and averaging.
  • ncBrowse[8] is a generic netCDF file viewer that includes Java graphics, animations and 3D visualizations for a wide range of netCDF file conventions.
  • ncview[9] is a visual browser for netCDF format files. This program is a simple, fast, GUI-based tool for visualising fields in a netCDF file. One can browse through the various dimensions of a data array, taking a look at the raw data values. It is also possible to change color maps, invert the data, etc.
  • Panoply[10] is a netCDF file viewer developed at the NASA Goddard Institute for Space Studies which focuses on presentation of geo-gridded data. It is written in Java and thus platform independent. Although its feature set overlaps with ncBrowse and ncview, Panoply is distinguished by offering a wide variety of map projections and the ability to work with different color scale tables.
  • The NCAR Command Language is used to analyze and visualize data in netCDF files (among other formats).
  • PyNIO[11] is a Python programming language module that allows read and/or write access to a variety of data formats, including netCDF.
  • Ferret is an interactive computer visualization and analysis environment designed to meet the needs of oceanographers and meteorologists analyzing large and complex gridded data sets. Ferret offers a Mathematica-like approach to analysis; new variables may be defined interactively as mathematical expressions involving data set variables. Calculations may be applied over arbitrarily shaped regions. Fully documented graphics are produced with a single command.
  • The Grid Analysis and Display System (GrADS)[12] is an interactive desktop tool that is used for easy access, manipulation, and visualization of earth science data. GrADS has been implemented worldwide on a variety of commonly used operating systems and is freely distributed over the Internet.
  • nCDF_Browser[13] is a visual nCDF browser, written in the IDL programming language. Variables, attributes, and dimensions can be immediately downloaded to the IDL command line for further processing. All the Coyote Library[14] files necessary to run nCDF_Browser are available in the zip file.
  • ArcGIS versions after 9.2[15] support netCDF files that follow the Climate and Forecast Metadata Conventions and contain rectilinear grids with equally-spaced coordinates. The Multidimensional Tools toolbox can be used to create raster layers, feature layers, and table views from netCDF data in ArcMap, or convert feature, raster, and table data to netCDF.
  • Origin 8 software imports netCDF files as matrix books, where each book can hold a 4D array. Users can select a subset of the imported data to make surface, contour, or image plots.
  • The Geospatial Data Abstraction Library provides support[16] for read and write access to netCDF data.

Common uses

NetCDF is commonly used in climatology, meteorology, and oceanography applications (e.g., weather forecasting, climate change) and in GIS applications.

It is an input/output format for many GIS applications and for general scientific data exchange. To quote from the Unidata site:[17]

"NetCDF (network Common Data Form) is a set of interfaces for array-oriented data access and a freely-distributed collection of data access libraries for C, Fortran, C++, Java, and other languages. The netCDF libraries support a machine-independent format for representing scientific data. Together, the interfaces, libraries, and format support the creation, access, and sharing of scientific data."


The Climate and Forecast (CF) conventions are metadata conventions for earth science data, intended to promote the processing and sharing of files created with the netCDF application programming interface (API). The conventions define metadata that are included in the same file as the data (thus making the file "self-describing") and that provide a definitive description of what the data in each variable represent, as well as the spatial and temporal properties of the data (including information about grids, such as grid cell bounds and cell averaging methods). This enables users of data from different sources to decide which data are comparable, and allows applications to be built with powerful extraction, regridding, and display capabilities.


An extension of netCDF for parallel computing called Parallel-NetCDF (or PnetCDF) has been developed by Argonne National Laboratory and Northwestern University.[18] It is built upon MPI-IO, the I/O extension to MPI communications. Using the high-level netCDF data structures, the Parallel-NetCDF libraries can apply optimizations to distribute file reads and writes efficiently among multiple processors. The Parallel-NetCDF package can read and write only the classic and 64-bit offset formats; it cannot read or write the HDF5-based format available with netCDF-4.0. The package provides similar, but distinct, APIs in Fortran and C.

The Unidata netCDF library has supported parallel I/O for HDF5 data files since release 4.0. Since version 4.1.1, the Unidata netCDF C library has also supported parallel I/O to classic and 64-bit offset files via the Parallel-NetCDF library, while keeping the netCDF API.

Interoperability of C/Fortran/C++ libraries with other formats

Starting with version 4.1.1, the netCDF C library, and the libraries based on it (Fortran 77 and Fortran 90, C++, and all third-party libraries), can read some data in other data formats. Data in the HDF5 format can be read, with some restrictions. Data in the HDF4 format can be read by the netCDF C library if it was created using the HDF4 Scientific Data (SD) API.

NetCDF-Java common data model

The NetCDF-Java library reads a number of other file formats and remote access protocols in addition to netCDF itself, with more in development. Since each of these is accessed transparently through the netCDF API, the NetCDF-Java library is said to implement a Common Data Model for scientific datasets.

The Common Data Model has three layers, which build on top of each other to add successively richer semantics:

  1. The data access layer, also known as the syntactic layer, handles data reading.
  2. The coordinate system layer identifies the coordinates of the data arrays. Coordinates are a completely general concept for scientific data; specialized georeferencing coordinate systems, important to the Earth Science community, are specially annotated.
  3. The scientific data type layer identifies specific types of data, such as grids, images, and point data, and adds specialized methods for each kind of data.

The data model of the data access layer is a generalization of the NetCDF-3 data model, and substantially the same as the NetCDF-4 data model. The coordinate system layer implements and extends the concepts in the Climate and Forecast Metadata Conventions. The scientific data type layer allows data to be manipulated in coordinate space, analogous to the Open Geospatial Consortium specifications. The identification of coordinate systems and data typing is ongoing, but users can plug in their own classes at runtime for specialized processing.
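The layering above can be caricatured in a few lines of Python. Every class and method name here is invented for illustration (the real library is written in Java, and its API differs); the point is only how each layer builds on the one beneath it:

```python
class DataAccessLayer:
    """Syntactic layer: reads raw named arrays, knows nothing about meaning."""
    def __init__(self, arrays):
        self.arrays = arrays          # name -> list of values
    def read(self, name):
        return self.arrays[name]

class CoordinateSystemLayer(DataAccessLayer):
    """Adds coordinates: records which arrays locate the others in space/time."""
    def __init__(self, arrays, coords):
        super().__init__(arrays)
        self.coords = coords          # data variable -> its coordinate variables
    def coordinates_of(self, name):
        return [self.read(c) for c in self.coords[name]]

class GridDataTypeLayer(CoordinateSystemLayer):
    """Adds a scientific data type: here, point lookup on a 1-D grid."""
    def value_at(self, name, coord_value):
        axis = self.coordinates_of(name)[0]
        return self.read(name)[axis.index(coord_value)]

# A toy dataset: temperatures recorded at hours 0 and 6.
grid = GridDataTypeLayer(
    arrays={"temp": [280.0, 281.5], "time": [0, 6]},
    coords={"temp": ["time"]},
)
```

Here `grid.value_at("temp", 6)` exercises all three layers in turn: a raw array read, a coordinate lookup, and finally typed access in coordinate space.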

References

  1. ^ "OGC standard netCDF Classic and 64-bit Offset". Retrieved 2013-11-27. 
  2. ^ "Background - The NetCDF Users' Guide". Retrieved 2013-11-27. 
  3. ^ "Version 4.0 of the netCDF API". Retrieved 2013-11-27. 
  4. ^ "ncdf". 2013-08-06. Retrieved 2013-11-27. 
  5. ^ "Rnetcdf". 2012-07-19. Retrieved 2013-11-27. 
  6. ^ "hnetcdf: Haskell NetCDF library". 2014-07-10. 
  7. ^ russ (1990-01-01). "List of software utilities using netCDF files". Retrieved 2013-11-27. 
  8. ^ "ncBrowse". Retrieved 2013-11-27. 
  9. ^ "ncview". Retrieved 2013-11-27. 
  10. ^ "Panoply". Retrieved 2013-11-27. 
  11. ^ "PyNIO". 2011-07-28. Retrieved 2013-11-27. 
  12. ^ "GrADS Home Page". Retrieved 2013-11-27. 
  13. ^ "Coyote's Guide to IDL Programming". 2013-11-23. Retrieved 2013-11-27. 
  14. ^ "Coyote Library". 2013-11-23. Retrieved 2013-11-27. 
  15. ^ "ArcGIS version 9.2". Retrieved 2013-11-27. 
  16. ^ "NetCDF network Common Data Form". Retrieved 2013-11-27. 
  17. ^ "What Is netCDF?". Unidata Program Center. Retrieved 2012-11-26. 
  18. ^ "parallel-netcdf". 2013-11-17. Retrieved 2013-11-27. 
  19. ^ [1]
  20. ^ [2]
  21. ^ [3]
  22. ^ [4]
  23. ^ "GINI Satellite Format". Retrieved 2013-11-27. 
  24. ^ "Unidata | GEMPAK". Retrieved 2013-11-27. 
  25. ^ [5]
  26. ^ "NetCDF". Retrieved 2013-11-27. 
  27. ^ "NetCDF-4". Retrieved 2013-11-27. 
  28. ^ Steve Ansari. "NCDC: Radar Resources". Retrieved 2013-11-27. 

External links

  • netCDF Release Notes from 3.3 to current version
  • netCDF project at the University Corporation for Atmospheric Research (UCAR)
  • netCDF-Java project at UCAR
  • ncBrowse - generic netCDF file viewer
  • Unidata's Common Data Model version 4
  • "An Introduction to Distributed Visualization"; section 4.2 contains a comparison of CDF, HDF, and netCDF.
  • NCO, a suite of programs known as operators, which facilitate manipulation and analysis of netCDF files
  • dapper data server and DChart web client for OPeNDAP in-situ data in netCDF format
  • Animating NetCDF Data in ArcMap
  • ncWMS, a Web Map Service and dynamic web application for visualizing NetCDF data (demo site)
  • CF Conventions documents
  • CF home page
  • Overview of CF
  • Projects and Groups using CF (partial)
  • List of software utilities using netCDF files

This article is based on material taken from the Free On-line Dictionary of Computing prior to 1 November 2008 and incorporated under the "relicensing" terms of the GFDL, version 1.3 or later.
