Information theory is increasingly being employed in the study of complex systems, particularly in the fields of neuroscience and artificial life. While domain-specific tools for information analysis are certainly valuable, high-performance, general-purpose toolkits can improve reproducibility and shorten research turnaround. We introduce Inform, an open-source, cross-platform C library for information-theoretic analysis of complex systems. Inform provides a host of functions for estimating information-theoretic measures from time series data. These include classical measures (e.g. entropy, mutual information) and measures of information dynamics (e.g. active information storage, transfer entropy), as well as several less common but powerful information-based concepts such as effective information, information flow, and integration measures. What makes Inform unique, however, is that it exposes a lower-level API allowing users to construct measures of their own, along with a suite of utility functions for augmenting and extending the built-in functionality. Significant effort went into designing Inform's API to make its use from other languages as simple as possible. We describe Inform's overall design and implementation, including details of our validation techniques and plans for future development. We present evidence suggesting that Inform's computational performance is at least comparable to that of the Java Information Dynamics Toolkit (JIDT), widely regarded as the gold standard in the field. Finally, we offer several examples to guide users and describe higher-level language wrappers for Python, R, Julia, and Mathematica.