Computation & Analysis

Programming Minecraft on the Raspberry Pi

The standard Raspbian software on the Raspberry Pi comes with a basic implementation of Minecraft and a full implementation of the Wolfram Language. Combining the two provides a fun playground for learning coding. If you are a gamer, you can use the richness of the Wolfram Language to programmatically generate all kinds of interesting structures in the game world, or to add new capabilities to the game. If you are a coder, you can treat Minecraft simply as a fun 3D rendering engine for the output of your code.
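As a taste of what this looks like, here is a minimal sketch using the MinecraftLink paclet that the full post introduces (function and block names follow that post; Minecraft: Pi Edition must be running first):

PacletInstall["MinecraftLink"]   (* one-time installation of the link package *)
Needs["MinecraftLink`"]
MinecraftConnect[]               (* attach to the running Minecraft: Pi Edition game *)
MinecraftChat["Hello from the Wolfram Language!"]   (* post to the in-game chat *)
MinecraftSetBlock[{0, 5, 0}, "Stone"]               (* place a stone block near the spawn point *)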

Computation & Analysis

The Shape of the Vote: Exploring Congressional Districts with Computation

In the past few decades, the process of redistricting has moved squarely into the computational realm, and with it the political practice of gerrymandering. But how can one solve the problem of equal representation mathematically? And what can be done to test the fairness of districts? In this post, I'll take a deeper dive with the Wolfram Language, using data exploration with Import and Association, built-in knowledge through the Entity framework, and various GeoGraphics visualizations to better understand how redistricting works, where issues can arise and how to identify the effects of gerrymandering.
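To give a flavor of the computations involved, here is a minimal sketch of one standard compactness test, the Polsby-Popper score 4π·area/perimeter², where 1 means a perfect circle and values near 0 suggest a highly contorted boundary. A state entity stands in for a district polygon here; the post itself works with actual congressional district data:

region = Entity["AdministrativeDivision", {"Maryland", "UnitedStates"}];  (* illustrative stand-in region *)
poly = region["Polygon"];                        (* boundary as a Polygon of GeoPosition points *)
area = GeoArea[poly];
perimeter = GeoLength[poly /. Polygon -> Line];  (* trace the same boundary as a line *)
polsbyPopper = N[UnitSimplify[4 Pi area/perimeter^2]]   (* dimensionless compactness score *)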

Announcements & Events

Launching the Wolfram Neural Net Repository

Today, we are excited to announce the official launch of the Wolfram Neural Net Repository! A huge amount of work has gone into training or converting around 70 neural net models that now live in the repository, and can be accessed programmatically in the Wolfram Language via NetModel:
net = NetModel["ResNet-101 Trained on ImageNet Competition Data"]
net[image]   (* "image" stands for the example photo embedded in the original post *)
Neural nets have generated a lot of interest recently, and rightly so: they form the basis for state-of-the-art solutions to a dizzying array of problems, from speech recognition to machine translation, from autonomous driving to playing Go. Fortunately, the Wolfram Language now has a state-of-the-art neural net framework (and a growing tutorial collection). This has made possible a whole new set of Wolfram Language functions, such as FindTextualAnswer, ImageIdentify, ImageRestyle and FacialFeatures. And deep learning will no doubt play an important role in our continuing mission to make human knowledge computable.
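For browsing, NetModel with no arguments lists the names of every model currently in the repository, and a model's architecture can be inspected without downloading its trained weights:

NetModel[]   (* names of all models in the repository *)
NetModel["ResNet-101 Trained on ImageNet Competition Data", "UninitializedEvaluationNet"]   (* architecture only, no weights *)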
Computation & Analysis

How Optimistic Do You Want to Be? Bayesian Neural Network Regression with Prediction Errors

Neural networks are very well known for their uses in machine learning, but they can also be applied to more specialized tasks, such as regression. Many people would probably first associate regression with statistics, but let me show you the ways in which neural networks can be helpful in this field. They are especially useful if the data you're interested in doesn't follow an obvious underlying trend that you could exploit with, say, polynomial regression.

In a sense, you can view neural network regression as an intermediary between true regression (where you have a fixed probabilistic model with some underlying parameters you need to find) and interpolation (where your goal is mostly to draw an eye-pleasing line between your data points). Neural networks give you something from both worlds: the flexibility of interpolation and the ability to produce predictions with error bars, as in true regression.
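The post builds nets that output full predictive distributions; as a much simpler point of comparison (not the Bayesian treatment developed in the post), the built-in Predict function can already attach an uncertainty estimate to a neural-network regression on synthetic data:

data = Table[x -> Sin[x] + RandomVariate[NormalDistribution[0, 0.1]], {x, RandomReal[{0, 6}, 200]}];  (* noisy sine samples *)
p = Predict[data, Method -> "NeuralNetwork"];
p[3.]                        (* point prediction *)
p[3., "StandardDeviation"]   (* per-prediction error estimate, usable for error bars *)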

Announcements & Events

Learning to Listen: Neural Networks Application for Recognizing Speech

Introduction

Recognizing words is one of the simplest tasks a human can do, yet it has proven extremely difficult for machines to achieve similar levels of performance. Things have changed dramatically with the ubiquity of machine learning and neural networks, though: modern techniques perform far better than those of just a few years ago. In this post, I'm excited to show a reduced but practical and educational version of the speech recognition problem, in which we consider only a limited set of words. This has two main advantages: first of all, we have easy access to a dataset through the Wolfram Data Repository (the Spoken Digit Commands dataset), and, maybe most importantly, all of the classifiers/networks I'll present can be trained in a reasonable time on a laptop.

It’s been about two years since the initial introduction of the Audio object into the Wolfram Language, and we are thrilled to see so many interesting applications of it. One of the main additions to Version 11.3 of the Wolfram Language was tight integration of Audio objects into our machine learning and neural net framework, and this will be a cornerstone in all of the examples I’ll be showing today.

Without further ado, let’s squeeze out as much information as possible from the Spoken Digit Commands dataset!
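Here is a minimal first step, pulling the dataset from the Wolfram Data Repository (the exact element structure is documented on the resource's page):

data = ResourceData["Spoken Digit Commands"];   (* Audio recordings of spoken digits, with labels *)
Take[data, 3]                                   (* inspect a few examples *)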

Education & Academic

Strange Circles in the Complex Plane—More Experimental Mathematics Results

The Shape of the Differences of the Complex Zeros of Three-Term Exponential Polynomials

In my last blog, I looked at the distribution of the distances of the real zeros of functions of the form cos(x) + cos(α x) + cos(β x) with incommensurate α, β. And after analyzing the real case, I now want to have a look at the differences of the zeros of three-term exponential polynomials of the form exp(i α z) + exp(i β z) + exp(i γ z) for real α, β, γ. (While we could rescale to set α = 0 and β = 1 for the zero set, keeping α, β and γ will make the resulting formulas look more symmetric.) Looking at the zeros in the complex plane, one does not see any obvious pattern. But by forming differences of pairs of zeros, regularities and patterns emerge, which often give some deeper insight into a problem. We do not make any special assumptions about the incommensurability of α, β, γ. The differences of the zeros of this type of function are all located on oval-shaped curves. We will find a closed form for these ovals. Using experimental mathematics techniques, we will show that the ovals are described by the solutions of an equation given in the full post.
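Although the excerpt stops before the closed form, the experimental starting point is easy to reproduce: numerically locate complex zeros and plot all pairwise differences. A minimal sketch with an illustrative (assumed, not the post's) choice of exponents α = 1, β = √2, γ = √3:

f[z_] := Exp[I z] + Exp[I Sqrt[2] z] + Exp[I Sqrt[3] z];
seeds = Flatten[Table[x0 + I y0, {x0, -40, 40, 2}, {y0, -4, 4, 1}]];    (* grid of complex starting points *)
roots = Quiet[z /. (FindRoot[f[z] == 0, {z, #}] & /@ seeds)];           (* polish each seed to a nearby zero *)
zeros = DeleteDuplicates[Round[roots, 10.^-5]];                         (* drop duplicate roots *)
ListPlot[ReIm /@ Flatten[Outer[Subtract, zeros, zeros]], AspectRatio -> Automatic]   (* the differences trace the ovals *)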
Education & Academic

A Tale of Three Cosines—An Experimental Mathematics Adventure

Identifying Peaks in Distributions of Zeros and Extrema of Almost-Periodic Functions: Inspired by Answering a MathOverflow Question

One of the Holy Grails of mathematics is the Riemann zeta function ζ(s), especially its zeros. One representation of ζ(s) is the infinite sum ∑_(n=1)^∞ 1/n^s. In the last few years, the interest in partial sums of such infinite sums and their zeros has grown. A single cosine or sine function is periodic, and the distribution of its zeros is straightforward to describe. A sum of two cosine functions can be written as a product of two cosines, cos(α x) + cos(β x) = 2 cos((α + β) x/2) cos((α − β) x/2). Similarly, a sum of two sine functions can be written as a product of a sine and a cosine, sin(α x) + sin(β x) = 2 sin((α + β) x/2) cos((α − β) x/2). This reduces the zero-finding of a sum of two cosines or sines to the case of a single one. A sum of three cosine or sine functions, e.g. cos(x) + cos(α x) + cos(β x), is already much more interesting.

Fifteen years ago, in the notes to Chapter 4 of Stephen Wolfram's A New Kind of Science, a log plot of the distribution of the zero distances of such a three-cosine sum, showing characteristic peaks, was shown.
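A minimal sketch of how one might reproduce such a zero-gap plot, using the illustrative (assumed, not taken from the post) frequencies 1, √2 and √3:

f[x_] := Cos[x] + Cos[Sqrt[2] x] + Cos[Sqrt[3] x];        (* an almost-periodic three-cosine sum *)
zeros = Sort[x /. NSolve[{f[x] == 0, 0 < x < 500}, x]];   (* all real zeros on an interval *)
Histogram[Differences[zeros], 100, "PDF"]                 (* distribution of distances between successive zeros *)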