September 6, 2018 — Brian Wood, Lead Technical Marketing Writer, Document and Media Systems


In my previous post, I demonstrated the first step of a multiparadigm data science workflow: extracting data. Now it’s time to take a closer look at how the Wolfram Language can help make sense of that data by cleaning it, sorting it and structuring it for your workflow. I’ll discuss key Wolfram Language functions for making imported data easier to browse, query and compute with, as well as share some strategies for automating the process of importing and structuring data. Throughout this post, I’ll refer to the US Election Atlas website, which contains tables of US presidential election results for given years:

(Table of US presidential election results from the US Election Atlas)
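As a rough sketch of the kind of import this workflow starts from (the URL and the "Data" element choice here are illustrative placeholders, not the exact calls from the post), a results table can be pulled from a page and wrapped as a Dataset for browsing and querying:

(* Illustrative sketch: import the tables found on a results page; the URL is a placeholder *)
tables = Import["https://uselectionatlas.org/RESULTS/", "Data"];

(* Wrap a rows-of-lists table as a Dataset, assuming its first row holds the column headers *)
toDataset[rows_List] := Dataset[AssociationThread[First[rows], #] & /@ Rest[rows]]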

Read More »


July 26, 2018 — Itai Seggev, Senior Kernel Developer, Algorithms R&D

One of the many beautiful aspects of mathematics is that often, things that look radically different are in fact the same—or at least share a common core. On their faces, algorithm analysis, function approximation and number theory seem radically different. After all, the first is about computer programs, the second is about smooth functions and the third is about whole numbers. However, they share a common toolset: asymptotic relations and the important concept of asymptotic scale.

By comparing the “important parts” of two functions—a common trick in mathematics—asymptotic analysis classifies functions based on the relative size of their absolute values near a particular point. Depending on the application, this comparison provides quantitative answers to questions such as “Which of these algorithms is fastest?” or “Is function f a good approximation to function g?”. Version 11.3 of the Wolfram Language introduces six of these relations, summarized in the following table.
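As a small, hedged illustration of how such relations are used (the full table of six relations appears in the complete post), two of the new functions can confirm familiar big-O-style comparisons:

(* x^2 grows no faster than x^3 as x approaches Infinity *)
AsymptoticLessEqual[x^2, x^3, x -> Infinity]

(* Sin[x] and x agree to leading order near 0 *)
AsymptoticEquivalent[Sin[x], x, x -> 0]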

Read More »


July 19, 2018 — Devendra Kapadia, Kernel Developer, Algorithms R&D

Asymptotic expansions have played a key role in the development of fields such as aerodynamics, quantum physics and mathematical analysis, as they allow us to bridge the gap between intricate theories and practical calculations. Indeed, the leading term in such an expansion often gives more insight into the solution of a problem than a long and complicated exact solution. Version 11.3 of the Wolfram Language introduces two new functions, AsymptoticDSolveValue and AsymptoticIntegrate, which compute asymptotic expansions for differential equations and integrals, respectively. Here, I would like to give you an introduction to asymptotic expansions using these new functions.
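As a minimal sketch of both functions, assuming the Version 11.3 calls behave as documented (the specific equation and integral here are my own simple choices, not examples from the post):

(* Fifth-order asymptotic (series) solution of y'' + y = 0 about x = 0 *)
AsymptoticDSolveValue[{y''[x] + y[x] == 0, y[0] == 1, y'[0] == 0}, y[x], {x, 0, 5}]

(* Three-term asymptotic approximation of a Laplace-type integral for large a *)
AsymptoticIntegrate[Exp[-a x]/(1 + x), {x, 0, Infinity}, {a, Infinity, 3}]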

Read More »


May 31, 2018 — Sjoerd Smit, Technical Consultant

Neural networks are well known for their uses in machine learning, but they can also be applied to more specialized tasks, such as regression. Many people would probably first associate regression with statistics, but let me show you the ways in which neural networks can be helpful in this field. They are especially useful when the data you’re interested in doesn’t follow an obvious underlying trend that you can exploit, as it would in polynomial regression.

In a sense, you can view neural network regression as an intermediate solution between true regression (where you have a fixed probabilistic model with some underlying parameters you need to find) and interpolation (where your goal is mostly to draw an eye-pleasing line between your data points). Neural networks give you something from both worlds: the flexibility of interpolation and the ability to produce predictions with error bars, as in regression.
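As a minimal sketch of the idea (this is not the Bayesian network built in the post; it simply uses the built-in "NeuralNetwork" method of Predict on synthetic data):

(* Noisy 1D data without an obvious parametric trend *)
data = Table[x -> Sin[x] + RandomVariate[NormalDistribution[0, 0.1]], {x, 0., 6., 0.1}];

(* Fit it with a neural-network regressor *)
p = Predict[data, Method -> "NeuralNetwork"];

p[2.5]                  (* point prediction *)
p[2.5, "Distribution"]  (* predictive distribution, from which error bars can be read *)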

Bayesian Neural Nets

Read More »


May 24, 2018 — Carlo Giacometti, Kernel Developer, Algorithms R&D

Introduction

Recognizing words is one of the simplest tasks a human can do, yet it has proven extremely difficult for machines to achieve similar levels of performance. Things have changed dramatically with the ubiquity of machine learning and neural networks, though: modern techniques achieve far better results than those of just a few years ago. In this post, I’m excited to show a reduced but practical and educational version of the speech recognition problem, in which we consider only a limited set of words. This has two main advantages: first, we have easy access to a dataset through the Wolfram Data Repository (the Spoken Digit Commands dataset), and, maybe most importantly, all of the classifiers/networks I’ll present can be trained in a reasonable time on a laptop.

It’s been about two years since the initial introduction of the Audio object into the Wolfram Language, and we are thrilled to see so many interesting applications of it. One of the main additions to Version 11.3 of the Wolfram Language was tight integration of Audio objects into our machine learning and neural net framework, and this will be a cornerstone in all of the examples I’ll be showing today.

Without further ado, let’s squeeze out as much information as possible from the Spoken Digit Commands dataset!
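As a hedged sketch of the setup, assuming the dataset is fetched by the name given above from the Wolfram Data Repository:

(* Fetch the Spoken Digit Commands dataset from the Wolfram Data Repository *)
spoken = ResourceData["Spoken Digit Commands"];

(* Peek at a few entries (audio recordings with their labels) before training anything *)
RandomSample[Normal[spoken], 3]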

Spoken Digit Commands dataset

Read More »


March 21, 2018
Patrik Ekenberg, Applications Engineer, Wolfram MathCore
Jan Brugård, CEO, Wolfram MathCore

We are excited to announce the latest installment in the Wolfram SystemModeler series, Version 5.1, where our primary focus has been on extending the use of system models beyond the initial stages of development.

Since 2012, SystemModeler has been used in a wide variety of fields for an even larger number of goals—such as optimizing the fuel consumption of a car, finding the optimal dosage of a drug for liver disease and maximizing the lifetime of a battery system. Version 5.1 expands SystemModeler beyond these earlier uses with a whole host of new options, such as:

  • Exporting models in a form that includes a full simulation engine, which makes them usable in a wide variety of tools
  • Providing the right interface for your models so that they are easy for others to explore and analyze
  • Sharing models with millions of users, now that the simulation core is included in the Wolfram Language

Wolfram SystemModeler 5.1
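As a small, hedged example of that last point (the model name below is simply the standard Modelica library's PID controller demo, not a model from the post), a model can now be simulated and plotted directly from the Wolfram Language:

(* Simulate a bundled Modelica example model for 10 seconds, directly in the Wolfram Language *)
sim = SystemModelSimulate["Modelica.Blocks.Examples.PID_Controller", 10];

(* Plot the simulation results *)
SystemModelPlot[sim]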

Read More »


March 2, 2018 — Brian Wood, Lead Technical Marketing Writer, Document and Media Systems

Do you want to do more with data available on the web? Meaningful data exploration requires computation—and the Wolfram Language is well suited to the tasks of acquiring and organizing data. I’ll walk through the process of importing information from a webpage into a Wolfram Notebook and extracting specific parts for basic computation. Throughout this post, I’ll be referring to this website hosted by the National Weather Service, which gives 7-day forecasts for locations in the western US:

(Screenshot of the National Weather Service 7-day forecast page)
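As a rough sketch of the first step (the URL below is a placeholder for the specific forecast page used in the post):

(* Import the forecast page; the "Data" element pulls out the page's structured content *)
page = Import["https://forecast.weather.gov/", "Data"];

(* The plain text of the page is also available, which is handy for simple string searches *)
text = Import["https://forecast.weather.gov/", "Plaintext"];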

Read More »


February 15, 2018 — Jérôme Louradour, Advanced Research Group


Are you ever certain that somewhere in a text or set of texts, the answer to a pressing question is waiting to be found, but you don’t want to take the time to skim through thousands of words to find what you’re looking for? Well, soon the Wolfram Language will provide concise answers to your specific, fact-based questions directed toward an unstructured collection of texts (with a technology very different from that of Wolfram|Alpha, which is based on a carefully curated knowledgebase).

Let’s start with the essence of FindTextualAnswer. This feature, available in the upcoming release of the Wolfram Language, answers questions by quoting the most appropriate excerpts of a text that is presumed to contain the relevant information.
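A minimal, hedged example of the call (the passage and question here are my own; the post works with much larger texts):

(* Ask a fact-based question of a short passage *)
text = "Paris is the capital and most populous city of France.";
FindTextualAnswer[text, "What is the capital of France?"]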

Read More »


November 9, 2017 — Devendra Kapadia, Kernel Developer, Algorithms R&D


Here are 10 terms in a sequence:

Table[(2/(2 n + 1)) ((2 n)!!/(2 n - 1)!!)^2, {n, 10}]

And here’s what their numerical values are:

N[%]

But what is the limit of the sequence? What would one get if one continued the sequence forever?

In Mathematica and the Wolfram Language, there’s a function to compute that:

DiscreteLimit[(2/(2 n + 1)) ((2 n)!!/(2 n - 1)!!)^2, n -> \[Infinity]]

Limits are a central concept in many areas, including number theory, geometry and computational complexity. They’re also at the heart of calculus, not least since they’re used to define the very notions of derivatives and integrals.
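For instance, the defining limit of a derivative can be computed directly; here is a small sketch (my own example) recovering the derivative of x^2 at a symbolic point a:

(* The difference quotient of x^2 at x = a, in the limit h -> 0, gives the derivative 2 a *)
Limit[((a + h)^2 - a^2)/h, h -> 0]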

Mathematica and the Wolfram Language have always had capabilities for computing limits; in Version 11.2, they’ve been dramatically expanded. We’ve leveraged many areas of the Wolfram Language to achieve this, and we’ve invented some completely new algorithms too. And to make sure we’ve covered what people want, we’ve sampled over a million limits from Wolfram|Alpha.

Read More »


October 10, 2017 — Etienne Bernard, Lead Architect, Advanced Research Group

Automated Data Science

Imagine a baker connecting a data science application to his database and asking it, “How many croissants are we going to sell next Sunday?” The application would simply answer, “According to your recorded data and other factors such as the predicted weather, there is a 90% chance that between 62 and 67 croissants will be sold.” The baker could then plan accordingly. This is an example of an automated data scientist, a system to which you could throw arbitrary data and get insights or predictions in return.

One key component in making this a reality is the ability to learn a predictive model from the data alone, without further specification from humans. In the Wolfram Language, this is the role of the functions Classify and Predict. For example, let’s train a classifier to distinguish morels from hedgehog mushrooms:

c = Classify[{
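The call above is cut off by the excerpt; as a hypothetical sketch of its shape (morel1, hedgehog1 and the other names stand in for the mushroom images, which are not reproduced here), training and using such a classifier looks roughly like this:

(* Hypothetical sketch: train on labeled example images and classify a new one *)
c = Classify[{morel1 -> "Morel", morel2 -> "Morel",
      hedgehog1 -> "Hedgehog", hedgehog2 -> "Hedgehog"}];

c[newMushroomImage]  (* returns the predicted class label *)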

Read More »