IT Stories

October 15, 2019

Research | October 01, 2019
Scientific Machine Learning: How Julia Employs Differentiable Programming to Do it Best
By Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah

Julia, an advanced dynamic programming language for numerical computing, has solved the “two-language problem” by allowing developers to write low-level, high-performance code and high-level “scripting” code in a single language. It is significantly faster than Python and R, and considerably more productive than C, C++, and Fortran. In addition to these successes, Julia continues to bring novel technologies to computational scientists. One of its most exciting new capabilities is differentiable programming (∂P). Recent work to leverage Julia’s compiler allows for the computation of efficient and accurate derivatives of arbitrary Julia programs. This is no mere party trick; program differentiation enables scientists to combine modern machine learning with the computational models developed from centuries of domain knowledge. Much as Julia provides the best of productivity and performance, ∂P offers scientific programmers tremendous benefits—in terms of both performance and accuracy—over black box approaches to machine learning.

At first glance, a casual practitioner might think that scientific computing and machine learning are different fields. Modern machine learning has made its mark through breakthroughs in neural networks. The applicability of such networks to solving a large class of difficult problems has led to the design of new hardware and software that process extremely high quantities of labeled training data while simultaneously deploying trained models in devices. In contrast, scientific computing—a discipline as old as computing itself—tends to use a broader set of modeling techniques that arise from underlying physical phenomena. Compared to the typical machine learning researcher, computational scientists generally work with smaller volumes of data but more computational range and complexity. However, deeper similarities emerge if we move beyond this superficial analysis. Both scientific computing and machine learning would be better served by the ability to differentiate, rather than the building of domain-specific frameworks. This is the purpose of ∂P.1

Figure 1. Taylor series expansion for sin(x).
∂P concerns computing derivatives in the sense of calculus. Scientists have calculated derivatives since Isaac Newton’s time, if not before, and machine learning has now made them ubiquitous in computer science. Derivatives power self-driving cars, language translation, and many engineering applications. Most researchers have likely written derivatives—often painfully by hand—in the past, but derivatives for scientific challenges are increasingly beyond reach.
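To make this concrete, here is a minimal sketch of forward-mode AD via dual numbers (in Python for illustration; the `Dual` class and `deriv` helper are our own hypothetical constructions, not part of any package mentioned in this article). Each value carries its derivative along with it, so an ordinary program, loops and all, computes its own derivative as a side effect:

```python
# Minimal forward-mode AD via dual numbers: each value is (val, dval).
class Dual:
    def __init__(self, val, dval=0.0):
        self.val, self.dval = val, dval

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dval + other.dval)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: d(uv) = u'v + uv'
        return Dual(self.val * other.val,
                    self.dval * other.val + self.val * other.dval)
    __rmul__ = __mul__

def deriv(f, x):
    """Derivative of f at x: seed the dual part with 1.0."""
    return f(Dual(x, 1.0)).dval

# A program with loops, not a closed-form formula.
def poly_sum(x):
    total = Dual(0.0)
    for k in range(1, 4):          # computes x + x^2 + x^3
        term = Dual(1.0)
        for _ in range(k):
            term = term * x
        total = total + term
    return total

# d/dx (x + x^2 + x^3) = 1 + 2x + 3x^2 = 17 at x = 2
print(deriv(poly_sum, 2.0))
```

Production AD systems operate on the compiler's program representation rather than wrapping values at run time, but the principle is the same: derivatives of programs, not just formulas.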
Julia’s approach has one overarching goal: enable the use of domain-specific scientific models as an integral part of researchers’ machine learning stacks. This can be in place of—or in addition to—big training sets. The idea is profoundly compelling, as these models embody centuries of human intelligence. Julia makes it possible for scientists to apply state-of-the-art machine learning without discarding hard-won physical knowledge. In a recent blog post, Chris Rackauckas discusses the essential tools of modern scientific machine learning and finds Julia to be the language best suited for the field [2].

Julia can perform automatic differentiation (AD) on eigensolvers, differential equations, and physical simulations. Loops and branches do not present obstacles. Researchers calculate derivatives in customized ways for a variety of problem areas, yet ∂P can greatly simplify the experience. We list a few and invite readers to share more:

1. Surrogate modeling: Running scientific simulations is often expensive because they evaluate systems using first principles. Allowing machine learning models to approximate the input-output relation can accelerate these simulations. After training neural networks or other surrogate models on expensive simulations once, researchers can use them repeatedly in place of the simulations themselves. This lets users explore the parameter space, propagate uncertainties, and fit the data in ways that were previously impossible.
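As a toy illustration of the surrogate-modeling idea, the following Python sketch stands a cheap least-squares polynomial in for the trained model, and a made-up closed-form function in for the "expensive" simulation (both are our own hypothetical constructions; in practice the surrogate would typically be a neural network trained on real simulation runs):

```python
import math
import numpy as np

# Stand-in for an expensive first-principles simulation (hypothetical).
def expensive_simulation(x):
    return math.exp(-0.5 * x) * math.cos(x)

# 1) Run the expensive model once over a coarse parameter sweep on [0, 5].
xs = [i * 0.1 for i in range(0, 51)]
ys = [expensive_simulation(x) for x in xs]

# 2) Fit a cheap surrogate; a neural network would play the same role.
coeffs = np.polyfit(xs, ys, deg=8)
surrogate = np.poly1d(coeffs)

# 3) Reuse the surrogate in place of the simulation at new inputs.
x_new = 2.34
err = abs(surrogate(x_new) - expensive_simulation(x_new))
print(f"surrogate error at x={x_new}: {err:.2e}")
```

Once fitted, the surrogate can be evaluated millions of times for parameter sweeps or uncertainty propagation at a tiny fraction of the simulation's cost.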

2. Adjoint sensitivity analysis: Calculating the adjoint of an ordinary differential equation (ODE) system u′ = f(u, p, t) requires solving a reverse ODE of the form λ′ = −λᵀ(∂f/∂u), with the parameter gradient then recovered from the integral of λᵀ(∂f/∂p). The vector-Jacobian product λᵀ(∂f/∂u) is precisely the primitive operation of backpropagation. Therefore, applying machine learning AD tooling to the ODE function f accelerates the adjoint calculations of scientific computing.
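A toy numerical sketch of this recipe follows (Python, with a scalar linear ODE chosen so the answer can be checked in closed form; all names are our own illustrative choices):

```python
import math

# Adjoint sensitivity for the scalar ODE u'(t) = f(u, p) = -p*u on [0, T],
# with loss G = u(T)^2.  Forward pass: explicit Euler for u.  Reverse
# pass: the adjoint ODE lam' = -lam * (df/du), integrated backward from
# lam(T) = dG/du(T).  The products lam*(df/du) and lam*(df/dp) are the
# vector-Jacobian products that reverse-mode AD supplies automatically.
p, u0, T, n = 0.7, 1.3, 2.0, 20_000
dt = T / n

# Forward solve, storing the trajectory for the reverse pass.
us = [u0]
for _ in range(n):
    us.append(us[-1] + dt * (-p * us[-1]))
uT = us[-1]

# Reverse solve: df/du = -p, so lam' = p*lam; step backward in time.
lam = 2.0 * uT                       # lam(T) = dG/du(T)
grad = 0.0
for k in range(n, 0, -1):
    grad += dt * lam * (-us[k - 1])  # accumulate lam*(df/dp); df/dp = -u
    lam -= dt * p * lam              # explicit Euler step backward in time

# Closed-form check: u(T) = u0*exp(-p*T), so dG/dp = -2*T*u(T)^2.
exact = -2.0 * T * (u0 * math.exp(-p * T)) ** 2
print(grad, exact)
```

In a real problem, f is a large system and the hand-written products above are exactly what an AD tool computes for you; that is the acceleration the article describes.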

Figure 2. Forward and reverse mode automatic differentiation (AD) of the code in Figure 1. The AD packages seamlessly handle loops, conditions, function calls, and much more.
3. Inverse problems: For many parameterized scientific simulations, researchers ask which parameter values would make their model best fit the observed data. This pervasive inverse problem is difficult because it requires the gradient of a large, existing simulation. One can train a surrogate model on a simulator and then use that model to solve inverse problems quickly; however, doing so requires generating massive amounts of training data, which is computationally expensive. Scientists can learn much more quickly and efficiently by differentiating through the simulator itself.
4. Probabilistic programming: Inference on statistical models is a crucial tool. Probabilistic programming enables more complex models and scaling to huge data sets by combining statistical methods with the generality of programming constructs. While AD is the backbone of many probabilistic programming tools, domain-specific languages lack access to an existing ecosystem of tools and packages. In a general-purpose language, ∂P has the benefit of higher composability, access to better abstractions, and richer models.

Julia’s flexible compiler—which can turn generic, high-level mathematical expressions into efficient native machine code—is ideal for ∂P. The Zygote.jl and Cassette.jl packages provide a multipurpose system for implementing reverse-mode AD, whereas ForwardDiff.jl provides forward mode. Julia runs efficiently on central processing units, graphics processing units (GPUs), parallel computers, and Google TPUs (tensor processing units), and is ready for future processors. Because of its composability, the combination of two packages that respectively offer the ability to conduct ∂P and run on GPUs automatically yields the capacity to perform ∂P on GPUs without additional effort.

∂P: Differentiate Programs, not Formulas: sin(x) Example

Figure 3. A trebuchet. Public domain image.
We begin with a very simple example to differentiate sin(x), written as a program through its Taylor series:

The number of terms is not fixed, but instead depends on x through a numerical convergence criterion. This runs on Julia v1.1 or higher (see Figure 1).

While we could have written the Taylor series for sine more compactly in Julia, we used a loop, a conditional, a print, and function calls to illustrate more complex programs. AD simply works, and that is powerful. Let’s compute the gradient at x=1.0 and verify that it matches cos(1.0) (see Figure 2).
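For readers without the figures at hand, here is a Python analogue of that program (the original is Julia differentiated by Zygote.jl; `taylor_sin` and `d` are our own illustrative names, and we use complex-step differentiation, a close cousin of forward-mode AD, to differentiate the program):

```python
import math

# sin(x) as a *program*: the number of Taylor terms is decided at run
# time by a convergence test, not fixed in advance.
def taylor_sin(x, tol=1e-12):
    term, total, k = x, x, 1
    while abs(term) > tol:                    # loop + conditional
        term *= -x * x / ((2 * k) * (2 * k + 1))
        total += term
        k += 1
    return total

# Complex-step differentiation: f'(x) ~ Im(f(x + i*h)) / h, accurate
# to near machine precision because no subtractive cancellation occurs.
def d(f, x, h=1e-20):
    return f(complex(x, h)).imag / h

print(taylor_sin(1.0), math.sin(1.0))     # the program computes sin(1.0)
print(d(taylor_sin, 1.0), math.cos(1.0))  # its derivative matches cos(1.0)
```

Note that nothing in `taylor_sin` was written with differentiation in mind; the loop and convergence test pose no obstacle.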

∂P: Differentiating a Trebuchet
One may wonder whether this concept scales to a more complex example. What if we replace tabulated data with a physical model? An example that has captured attention—perhaps due to its appearance in battle scenes from Game of Thrones—is the trebuchet, a medieval battle catapult (see Figure 3) [1]. We perform ∂P on the trebuchet, which—despite its deceptively simple appearance—is a non-trivial system to simulate. A differential equation models the distance a projectile travels given the launch angle, counterweight mass, and wind speed. The human operator can control the counterweight mass and launch angle for a given wind speed.

This illustration combines a neural network with the trebuchet’s dynamics (see Figure 4). The network learns the dynamics in a training loop. For a given wind speed and target distance, it generates trebuchet settings (the mass of the counterweight and the launch angle) that we feed into the simulator to calculate the achieved distance. We then compare this distance to our target and backpropagate through the entire chain to adjust the network’s weights. This is where ∂P arises. The training is quick because we have expressed exactly what we want from the model in a fully differentiable way; the model trains within a few minutes on a laptop with randomly generated data. Compared to solving the inverse problem of aiming the trebuchet by parameter estimation via gradient descent (which takes 100 milliseconds), the trained neural network (five microseconds) is 20,000 times faster. The Trebuchet.jl repository contains all the necessary code and examples.
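The gradient-descent baseline mentioned above can be sketched with a drastically simplified stand-in for the simulator (Python; the ideal-projectile range formula replaces Trebuchet.jl's differential-equation model, and all names here are hypothetical):

```python
import math

# A drastically simplified stand-in for the trebuchet simulator: ideal
# projectile range as a function of launch angle theta (radians) at a
# fixed speed v.  The real Trebuchet.jl model solves an ODE; this
# closed form just illustrates "backpropagating through the simulator".
g, v = 9.81, 30.0

def simulate_range(theta):
    return v * v * math.sin(2 * theta) / g

def d_range(theta):                      # exact derivative of the simulator
    return 2 * v * v * math.cos(2 * theta) / g

# Aim at a target by gradient descent on the squared miss distance --
# the inner loop the article's parameter-estimation baseline runs.
target = 60.0
theta = 0.3                              # initial guess, below 45 degrees
for _ in range(300):
    miss = simulate_range(theta) - target
    theta -= 1e-5 * 2 * miss * d_range(theta)   # chain rule through simulator

print(f"theta = {math.degrees(theta):.2f} deg, "
      f"range = {simulate_range(theta):.2f} m (target {target} m)")
```

The neural network in the article amortizes this inner loop: it learns the map from (wind speed, target) to settings once, so that aiming later costs a single forward pass instead of an optimization.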

Figure 4. Training a neural network to learn the dynamics of a trebuchet by incorporating the trebuchet’s ordinary differential equations into the loss function. Figure courtesy of [1].
The DiffEqFlux.jl project, which further explores many of the ideas in neural ODEs, is also noteworthy. For example, it allows use of an ODE as a layer in a neural network and provides a general framework for combining Julia’s ODE capabilities with neural networks.

Leading institutions like Stanford University; the University of California, Berkeley; and the Massachusetts Institute of Technology continue to utilize Julia for both research and teaching in introductory and advanced courses. We hope our new ∂P capabilities will help researchers combine ideas in machine learning with those in science and engineering, and ultimately lead to novel breakthroughs.

1 As 2018 Turing Award winner Yann LeCun put it, “Deep learning est mort. Vive differentiable programming!” (“Deep learning is dead. Long live differentiable programming!”)

Jeff Bezanson, Stefan Karpinski, and Viral B. Shah—creators of the Julia Language—received the James H. Wilkinson Prize for Numerical Software at the 2019 SIAM Conference on Computational Science and Engineering, which took place earlier this year in Spokane, Wash. The prize recognized Julia as “an innovative environment for the creation of high-performance tools that enable the analysis and solution of computational science problems.”

[1] Innes, M., Joy, N.M., & Karmali, T. (2019). Differentiable Control Programs. The Flux Machine Learning Library. Retrieved from
[2] Rackauckas, C. (2019). The Essential Tools of Scientific Machine Learning (Scientific ML). The Winnower 6:e156631.13064. Retrieved from

Further Reading
– Bezanson, J., Edelman, A., Karpinski, S., & Shah, V.B. (2017). Julia: A fresh approach to numerical computing. SIAM Rev., 59(1), 65-98.
– Innes, M. (2019). What is Differentiable Programming? The Flux Machine Learning Library. Retrieved from
– Rackauckas, C., Innes, M., Ma, Y., Bettencourt, J., White, L., & Dixit, V. (2019). DiffEqFlux.jl – A Julia library for neural differential equations. Preprint, arXiv:1902.02376.




Eight must-ask user testing questions for better UX
October 30, 2018

Here are some examples of great user testing questions that can always offer insight into the mind of the user. And that is the final goal with any user test, isn’t it?
You have your brand-new prototype, ready to be tested by target users. In theory, you hand the prototype over to your users and check how they respond to it. This will lead you to identify any failures or mistakes that can negatively affect the performance of your product. But is anything in life ever that simple?

We all know user testing is important, but what’s the right approach? When are you supposed to carry out your user testing? What are you meant to ask? User testing questions need to be carefully planned if you want to ensure that your data can be trusted.

In truth, there is no single right answer. Your user testing will likely depend on several variables, ranging from the type of product to your industry and the maturity of your prototype. Just like no one can tell you the one secret to eternal life, no one can claim a one-size-fits-all recipe for perfect user testing questions.

Instead, see how you can adapt each question to your product and your users. In the end, we all need to understand our users’ wants and needs, no matter the type of product. These questions may not be a recipe for success, but they are a great base on which to build your user testing plan.

What is the product for? What is its purpose?
This user testing question gives you clues about how consistent and clear your product is. If the first screen your user sees is pink and filled with cartoon characters, they’ll be surprised to find accounting software a few screens down the road.

No product out there is perfect – even the simplest items still confuse at least one customer in the world. Take full advantage of your test and identify anything that can mislead users – design elements that don’t fit in, features that don’t add any value to the product, or navigation issues.

The use and purpose of the product should be clear from the start. It’s important to be coherent throughout the whole product so that your customers aren’t wondering what is happening at any point during use.

What did you think should have happened there?
Using a new product can be confusing, even with coherent design. People find it difficult to articulate exactly what they don’t understand, or why.

By asking what the user thinks should have happened, you get a clear sense of the disparity between how things are and how they should be. Answers to this user testing question can give you a path to a more intuitive and logical product. Remember: having a coherent product from start to finish translates to good usability.


How would you describe the product in your own words?
After working so hard and creating something out of nothing, we can get stuck in a loop. We see the product in a certain way and we assume others do too – a serious mistake. What matters is how customers and users see your product, and all user testing questions aim to illustrate users’ perception.

People have a tough time describing things in detail, but that isn’t necessarily bad news for you. Even in broad strokes, users will tell you exactly how your product comes across, as opposed to how you see it yourself.

Why didn’t you use X feature?

Giving users the time to explore and play around with your product without any direction can tell you a lot about the features and their usability. If the user is free to explore and intentionally ignores a certain aspect or feature of your product, you want to know why.

This question is recommended by User Testing Blog because it can open your eyes to the true face of each feature in your product. Is that feature’s presence logical in the product? Is it easy to reach within the product? Many entrepreneurs and startups get so excited about delivering the best possible outcome that they rush to include as many features as possible.

It’s perfectly logical to want a product that is complete in solving a problem for users. But you can also end up with a product crammed with endless features that users just won’t use. No value added, just extra cost. Use your user testing questions to make sure all features come together in harmony and keep an eye out for features that can be improved or altogether eliminated.

Who do you know that would like this product?
This is a clever way to ask the user to describe your product, minus any social pressure. Beautifully explained by Purple Design’s Chris Gallello, this user testing question has lots to offer.

In truth, users don’t usually like telling you that they don’t like the product even during user testing sessions – so give them a comfortable way of doing so. Instead of asking if the user would use your product, ask them about their friends and family.

Imagine a user tells you their brother would most likely use and enjoy your product. Maybe their brother isn’t your target user, but it does tell you a lot about how the user perceives the product.

“What is your brother like?” will yield insight into the link between the brother and the product. If the user says the brother is straightforward, no-nonsense, and analytical, then that is how they see your product. Is this how you want your product to be seen? Is there a great disparity between the brother’s description and your target market?

Can you think of any other product that resembles this one?
If you’re carrying out user testing, you’ve most likely already identified your main competitors. But it can be insightful to hear from users exactly what products out there they think can match or compare to yours.

Users have an interesting way of drawing comparisons between products. Their reasoning can involve functionality, design, pricing, or some personal reason why your product reminds them of another.

This can be useful in both finding other possible competitors you may have overlooked, and finding products that evoke the same design and personality. You could probably learn a thing or two from them. Well-planned user testing questions can bring all sorts of benefits!

What device do you see yourself using for this product?
No company can afford to overlook the importance of the devices on which your product can be used. Be it tablets, smartphones or regular computers – it is important to know which one could fit in with your target users.


Aside from design considerations, the device used for your product can affect how long users spend on it, what they do with it, and how often they come back. Take the opportunity during your user testing to learn which devices you can expect users to be on, and adapt your design accordingly.

List out 3 things you dislike about the product.
Granted, this isn’t really a question. But just like any other user testing question here, it can give you an advantage over the competition. Here is the twist: when you ask it, show the user your main competitor’s product instead of your own.

This clever trick was created by UX guru and author Laura Klein – you can read more about it in her book “UX for Lean Startups.” It’s a great read.

Nobody wants to help the competition do better. But knowing where the competition fell short of users’ expectations provides a map of what to avoid. No matter how big its market share, no product is beyond improvement. Your competitors might have some great features or incredible design, but their product has its flaws – and users can tell you exactly what those flaws are.

This can be a creative way to identify gaps in the industry, or aspects that your competitors have neglected. You will be able to improve your product – and offer users everywhere something truly unique that will keep them coming back.

Coming up with the right user testing questions can be a challenge. Just remember that the final goal of this test is to gather insight into users’ minds, their likes and wants. Remember that it’s always best to plan your user testing questions and carry out the test as early as possible. Give your users the space to make their own way around your product.

Once you have your user testing questions planned out, you’ll be closer to a final product your users can love.


October 18, 2019


Scientists develop a lithium-ion battery that won’t catch fire
by Amanda Zrebiec, Johns Hopkins University

Credit: Johns Hopkins University
A flexible lithium-ion battery designed by a team of researchers from the Johns Hopkins Applied Physics Laboratory and built to operate under extreme conditions—including cutting, submersion, and simulated ballistic impact—can now add incombustible to its résumé.

Current Li-ion batteries are susceptible to catastrophic fire and explosion incidents—most of which arrive without any discernible warning—because they are built with flammable and combustible materials. Samsung Galaxy Note7 phones were banned from airlines as a result of this danger, and the Navy’s prohibition of e-cigarettes on ships and submarines is a direct response to the need to reduce the flammability of such devices.

With these batteries emerging as the energy storage vehicle of choice for portable electronics, electric vehicles, and grid storage, these safety advancements mark a significant step forward in transforming the way Li-ion batteries are manufactured and used in electronic devices.

In research published recently in the journal Chemical Communications, the team, led by Konstantinos Gerasopoulos of APL’s Research and Exploratory Development Department, details its latest discovery: a new class of “water-in-salt” and “water-in-bisalt” electrolytes—referred to as WiS and WiBS, respectively—that, when incorporated in a polymer matrix, reduces water activity and elevates the battery’s energy capabilities and life cycle while ridding it of the flammable, toxic, and highly reactive solvents present in current Li-ion batteries. It’s a safe, powerful alternative, the researchers say.

“Li-ion batteries are already a constant presence in our daily lives, from our phones to our cars, and continuing to improve their safety is paramount to further advancing energy storage technology,” said Gerasopoulos, senior research scientist and principal investigator at APL. “Li-ion battery form factors have not changed much since their commercialization in the early 1990s; we still use the same cylindrical or prismatic cell types. The liquid electrolyte and required hermetic packaging have a lot to do with that.

“Our team’s efforts have generally been focused on replacing the flammable liquid with a polymer that improves safety and form factor. We are excited about where we are today. Our recent paper shows improved usability and performance of water-based flexible polymer Li-ion batteries that can be built and operated in open air.”

Additionally, the damage tolerance initially demonstrated with the team’s flexible battery in 2017 is further improved in this new approach to creating Li-ion batteries.

“The first generation of flexible batteries were not as dimensionally stable as those we are making today,” Gerasopoulos said.

With this latest benchmark reached, the researchers continue to work on further advancements of this technology.

“Our team is continuously improving the safety and performance of flexible Li-ion batteries,” said Jeff Maranchi, the program area manager for materials science at APL. “We have already achieved further discoveries building upon this most-recently reported work that we are very excited about. We hope to transition this new research to prototyping within the year.”