Just as individual data bytes may become more richly annotated with attributes (an extension of data provenance: augmenting data with inspectable elements such as creation, review, and launch timestamps, plus owner, quality, freshness, and controversy properties), so too may quantitative data sets.
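As a minimal sketch of what such an attribute-modulated datum might look like, here is a simple Python record type; all field names are hypothetical illustrations of the idea, not an established metadata standard:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AnnotatedDatum:
    """A data value carrying inspectable provenance attributes.

    Field names are illustrative assumptions, not a standard schema.
    """
    value: bytes
    created: datetime             # creation timestamp
    reviewed: Optional[datetime]  # last review timestamp, if any
    launched: Optional[datetime]  # publication/launch timestamp, if any
    owner: str                    # responsible party
    quality: float                # e.g., 0.0-1.0 reviewer-assigned score
    freshness_days: int           # days since last verification
    controversy: float            # e.g., 0.0-1.0 disagreement measure

datum = AnnotatedDatum(
    value=b"42",
    created=datetime(2024, 1, 5, tzinfo=timezone.utc),
    reviewed=None,
    launched=None,
    owner="alice@example.org",
    quality=0.9,
    freshness_days=30,
    controversy=0.1,
)
```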
There should be a standardized ‘2.0 format’ toolkit for quantitative data analysis that bundles the ten techniques most often used to analyze data sets. These tools should be user-friendly, ideally deployed as a widget overlay on websites, or otherwise easily accessible and usable by non-quant laypersons.
Suggested techniques for inclusion in the top ten most useful data analysis tools (a minimal sketch of several of these follows the list):
- Fourier transforms
- Markov state models
- Entropy analysis
- Distribution analysis (e.g., power law, Gaussian)
- Progression analysis (e.g., linear, geometric, exponential, discontinuous)
- Qualitative math
- Network analysis (node/group analysis, graph theory)
- Complexity, chaos, turbulence, and perturbation modeling
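As a rough sketch of how several of these techniques could sit behind one simple interface, here is hedged Python using NumPy, SciPy, and NetworkX; the function names and the choice of these particular libraries are assumptions for illustration, not a specification of the proposed toolkit:

```python
import numpy as np
from scipy import stats
import networkx as nx

def dominant_frequency(samples: np.ndarray, sample_rate: float) -> float:
    """Fourier transform: return the strongest frequency component (Hz)."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC term

def shannon_entropy(samples: np.ndarray, bins: int = 32) -> float:
    """Entropy analysis: Shannon entropy (bits) of a histogram of the data."""
    counts, _ = np.histogram(samples, bins=bins)
    return stats.entropy(counts, base=2)

def distribution_check(samples: np.ndarray) -> dict:
    """Distribution analysis: fit a Gaussian and report a goodness-of-fit p-value."""
    mu, sigma = stats.norm.fit(samples)
    _, p_value = stats.kstest(samples, "norm", args=(mu, sigma))
    return {"mean": mu, "std": sigma, "gaussian_fit_p": p_value}

def key_nodes(edges: list) -> dict:
    """Network/graph analysis: rank nodes by degree centrality."""
    graph = nx.Graph(edges)
    return nx.degree_centrality(graph)

# Usage on synthetic data:
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 1000)) + rng.normal(0, 0.1, 1000)
print(dominant_frequency(signal, sample_rate=1000))   # ~5.0 Hz
print(shannon_entropy(signal))
print(distribution_check(rng.normal(0, 1, 1000)))
print(key_nodes([("a", "b"), ("b", "c"), ("b", "d")]))
```

Each helper wraps one listed technique behind a single call with plain-language outputs, which is the property a layperson-facing widget would need regardless of the underlying libraries chosen.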