Using SwiftUI in a production app

The three main challenges we encountered while building an iOS app that makes math graphs audible to blind students

Blind person with a mobile phone that plays musical notes corresponding to the graph in front of him.

What good is a mathematical graph if you cannot see it? Nothing. Together with Royal Visio, the Dutch center of expertise for people with a visual impairment, we created an iOS app that allows blind students to understand graphs with the help of sound: SenseMath. SenseMath not only empowers hundreds of students in mathematics education, it proves that a purely auditory interface can be a rich and intuitive experience for a blind or partially sighted person.

You can learn more about the SenseMath case here, or even download the app yourself. For this blog post I want to focus on how SenseMath is built. Particularly, how SwiftUI is used in the app and what we learned along the way.

To build the interface we used SwiftUI, the first time we did so for a Q42 production app. (My colleague Mathijs wrote a blog post about another SwiftUI app we built recently.) However, not all functionality could be implemented in SwiftUI, which led to several challenges we had to overcome. This article covers the three main challenges we faced to successfully deliver the SenseMath app.

Formulas

Correctly pronouncing formulas with the accessibility feature VoiceOver: how hard can it be? While building SenseMath, I didn’t give it a second thought, but I quickly ran into a wall. Since the app was developed specifically for VoiceOver users, formulas had to be parsed and pronounced correctly with VoiceOver. But how?

No iOS framework supported this feature out of the box. Simply displaying the formula as written in code, y = x^2 + 1, was not the way to go. Firstly, because that is not how a formula should be visualized: it should look like y = x² + 1. Secondly, it was not pronounced the way it should be: VoiceOver should read “y equals x squared plus one” instead of “y equals x caret two plus one”. Because this functionality is the core of SenseMath, we needed it to be done correctly.

While searching for a solution, we found out that Apple itself parses and pronounces formulas correctly in their own Pages app. This finding led to even more questions. How does Apple do this in their own apps? Are they using a private API? Did they build something custom? Since I couldn’t find a readily available implementation of this functionality, I concluded it’s the latter and moved to a custom solution as well.

Besides Pages, I noticed that formatted formulas were also pronounced correctly on the web. And unlike with Pages, there I could inspect why it was working. There is an XML format dating back to 1998, called MathML, that can display and pronounce formulas correctly. The formula x² + 4x + 4 = 0 can be written as:

<mrow>
    <mrow>
        <msup> <mi>x</mi> <mn>2</mn> </msup> <mo>+</mo>
    </mrow>
    <mn>4</mn>
    <mi>x</mi>
    <mrow>
        <mo>+</mo>
        <mn>4</mn>
    </mrow>
    <mo>=</mo>
    <mn>0</mn>
</mrow>

From here I had a starting point for a solution. There was just one challenge: how could I get a formula typed in the app as x^2 + 4x + 4 = 0 into the XML format shown above? Sure, I could write some regexes to try to break the formula down, but that quickly escalated into a huge effort, and I hadn’t even taken ordering or other logic into account. Luckily there is a JavaScript library called MathJax that does all of this for you: it transforms the formula in an HTML body into the required XML format. Now the formula is pronounced correctly on the web, but what does that have to do with our SwiftUI app? We only fixed the problem on a different platform and in a different programming language.

To port this to SenseMath, I had to use a WKWebView. Besides a URL, a WKWebView can also load plain HTML. This HTML is built using a multiline string, where the body can be passed in from within the app:

struct MathML {
  var sourceCode: String

  init(label: String, expression: String, scriptEnabled: Bool = true) {
    self.sourceCode = """
      <!DOCTYPE html>
      <html>
      <head>
        <meta name="viewport" content="width=device-width, user-scalable=1">
        <style>
          @media (prefers-color-scheme: dark) {
            body {
              background-color: rgba(28,28,30,1);
            }

            math * {
              color: white;
            }
          }
        </style>
        <script src="load-mathjax.js" async></script>
      </head>
      <body>
        \(expression)
      </body>
      </html>
      """
  }
}

After cleaning up this HTML a bit, the multiline string can be injected into the WKWebView:

webView.loadHTMLString(sourceCode, baseURL: nil)

While this is a stripped-down version of the final implementation, it is essentially the only logic used to render formulas in the app. With some CSS styling we can make the WKWebView behave just like a normal UIView. The user won’t even visually notice that it’s a web view.

However, with a slow internet connection this could take a couple of seconds to load, because initially the MathJax library was loaded from a remote URL in the HTML we built. A quick fix was to download MathJax and all of the submodules it uses, and bundle those files with the app so the formula loads instantly without any delays. And because parsing formulas no longer required the remote MathJax library, we got offline support for this feature for free!
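
For illustration, here is one way to point the web view at those bundled files: pass the app bundle’s resource directory as the base URL, so the load-mathjax.js reference in the HTML resolves to the local copy. This is a sketch, not necessarily SenseMath’s exact approach, and the expression format (TeX between $$ delimiters) is just an example of input MathJax understands:

let formula = MathML(label: "Parabola", expression: "$$x^2 + 4x + 4 = 0$$")

// With the bundle's resource directory as baseURL, relative references such as the
// bundled load-mathjax.js script load from the app itself instead of the network.
webView.loadHTMLString(formula.sourceCode, baseURL: Bundle.main.resourceURL)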

Drawing graphs for different screen sizes

The next challenge was to complete the graph for displaying the formula. This was one of the first limitations I ran into with SwiftUI. There was no good native or third party framework that could deliver the result we were aiming for. Luckily, Apple provides UIHostingController: I can go back to UIKit to build what I want and use it as a SwiftUI view. This is also the path I chose to draw the grid for the graph. The drawing consists of multiple steps:

  1. Determine the screen size.
  2. Calculate the width of one step in pixels (screen size divided by the step size of the graph).
  3. Start at view coordinates (0, 0) and draw a line from that point to the edge of the graph.
  4. Repeat by moving the start coordinate up by the step size in pixels and drawing a new line (a sketch of this loop follows the example code below).
  5. Do this for both orientations.

To give you a small example of how this works, I abstracted some code from the project:

let bezier = UIBezierPath()

// Horizontal axis: a line across the full width of the view.
if let horizontalAxisLine = horizontalAxisLine {
  let yOrigin = CGFloat(horizontalAxisLine) * bounds.height
  bezier.move(to: CGPoint(x: 0, y: yOrigin))
  bezier.addLine(to: CGPoint(x: bounds.width, y: yOrigin))
}

// Vertical axis: a line across the full height of the view.
if let verticalAxisLine = verticalAxisLine {
  let xOrigin = CGFloat(verticalAxisLine) * bounds.width
  bezier.move(to: CGPoint(x: xOrigin, y: 0))
  bezier.addLine(to: CGPoint(x: xOrigin, y: bounds.height))
}

path = bezier.cgPath
lineWidth = 1
fillColor = nil
strokeColor = UIColor.label.cgColor
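
This only draws the two axis lines; the repeated grid lines from steps 2–4 follow the same pattern and are added to the same bezier path before it is assigned to the layer. Roughly, with illustrative variable names rather than the project’s actual code:

// Step 2: the width of one grid cell in pixels.
let stepWidth = bounds.width / CGFloat(numberOfVerticalGridLines)

// Steps 3 and 4: add a vertical line at every step, moving from left to right.
for index in 0...numberOfVerticalGridLines {
  let x = CGFloat(index) * stepWidth
  bezier.move(to: CGPoint(x: x, y: 0))
  bezier.addLine(to: CGPoint(x: x, y: bounds.height))
}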

Calculating graph.verticalGridLines and graph.horizontalGridLines correctly took a bit more effort than I initially thought. After drawing a simple grid, even more complex edge cases surfaced.

How do you draw bolder lines at x = 0 and y = 0? Where do you start drawing when x and/or y are negative? How do you align everything perfectly so you don’t end up with half-drawn cells because the calculation wasn’t quite right? For the sake of the length of this article, I won’t go over these challenges here. I just want to say that drawing the grid is somewhat more complicated than I initially thought. If you're facing similar challenges, feel free to drop me an email at furkan@q42.nl. I'd be happy to have a chat about it.

Now that we've successfully drawn the graph, the next challenge was to track the progress of the sound being played. A vertical red line moves from left to right while the graph’s audio is playing. It is drawn similarly to the other lines above, with a UIBezierPath, so drawing the line itself was easy. But how do you keep the line in sync with the sound that is playing? The logic for playing audio lives somewhere else in the app. Instead of passing the audio progress through the whole app, we solved this with NotificationCenter.

Not to be confused with push notifications, NotificationCenter allows you to send a signal to every corner of the app, and to listen for specific signals. Every time the audio of the graph updates, the audio logic posts a Notification with the progress as a value between 0 and 1. The vertical red line, which we call the ProgressLayer, listens for this information and updates its frame accordingly. To the user, it appears to animate from left to right.
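
As a rough illustration of that flow, the audio side could post the progress and the ProgressLayer could observe it like this. The notification name, userInfo key, and method names are made up for this sketch and are not the actual SenseMath code:

import UIKit

// Hypothetical notification name; the real name in SenseMath may differ.
extension Notification.Name {
  static let graphPlaybackProgress = Notification.Name("graphPlaybackProgress")
}

// Audio side: post the progress (a value between 0 and 1) every time it updates.
func reportPlaybackProgress(_ progress: Double) {
  NotificationCenter.default.post(name: .graphPlaybackProgress,
                                  object: nil,
                                  userInfo: ["progress": progress])
}

// Drawing side: the layer behind the vertical red line listens and moves itself.
final class ProgressLayer: CAShapeLayer {
  private var observer: NSObjectProtocol?

  func startObserving(containerWidth: CGFloat) {
    observer = NotificationCenter.default.addObserver(forName: .graphPlaybackProgress,
                                                      object: nil,
                                                      queue: .main) { [weak self] notification in
      guard let progress = notification.userInfo?["progress"] as? Double else { return }
      // Slide the line from the left edge (0) to the right edge (containerWidth).
      self?.frame.origin.x = containerWidth * CGFloat(progress)
    }
  }
}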

Overall, drawing a grid in SwiftUI turned out to be a bit more complicated than I initially thought, but not impossible. Calculating the coordinates of the lines, drawing them with UIBezierPath, and hosting the result in a UIHostingController seemed the best solution. Which leads us to the last challenge: the accessibility rotor.

Using the rotor in SwiftUI

The accessibility rotor is one of the key features of the app: it gives users superpowers because they can quickly configure the graph to their needs. For example, you can quickly start or stop the sound of the graph, but you can also toggle settings in the graph itself, like the sound of intersections between graphs. At the time of building SenseMath, the accessibility rotor was not available in SwiftUI, so once again we needed to dive back into UIKit to offer this functionality.

This approach goes one step deeper into mixing UIKit and SwiftUI than before. It consists of three nested views:

GraphPlayer: SwiftUI
   ↪ GraphPlayerRepresentable: UIKit
       ↪ GraphPlayerView: SwiftUI

From top to bottom, these are the views nested inside each other. The GraphPlayerRepresentable serves as an intermediate layer where we can add custom rotor support. Here is some abstracted code to clear things up a bit:

struct GraphPlayer: View {
  // `graph` and `conductor` are part of the surrounding app code, abstracted away here.
  var body: some View {
    GraphPlayerRepresentable()
      .navigationTitle(graph.name)
      .onDisappear { conductor.stop() }
  }
}

struct GraphPlayerRepresentable: UIViewControllerRepresentable {
  typealias UIViewControllerType = UIHostingController<GraphPlayerView>

  func makeUIViewController(context: Context) -> UIViewControllerType {
    UIHostingController(rootView: GraphPlayerView())
  }

  func updateUIViewController(_ uiViewController: UIViewControllerType, context: Context) {
    uiViewController.accessibilityCustomRotors = [
      // Custom rotors
    ]
  }
}

struct GraphPlayerView: View {
  var body: some View {
    // Actual graph screen
  }
}
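
To give an idea of what could go inside that accessibilityCustomRotors array, here is a minimal sketch of a custom rotor. The rotor name, the helper function, and the interestingPointElements parameter are illustrative; this is not the actual SenseMath rotor code:

// A hypothetical rotor that moves VoiceOver focus between interesting points of the
// graph, given as an array of accessibility elements (e.g. intersections or extremes).
func interestingPointsRotor(_ interestingPointElements: [UIAccessibilityElement]) -> UIAccessibilityCustomRotor {
  UIAccessibilityCustomRotor(name: "Interesting points") { predicate in
    // Find the element VoiceOver is currently on, defaulting to the first one.
    let current = predicate.currentItem.targetElement as? UIAccessibilityElement
    let currentIndex = current.flatMap { interestingPointElements.firstIndex(of: $0) } ?? 0

    // Move forward or backward depending on the direction the user turns the rotor.
    let nextIndex = predicate.searchDirection == .next ? currentIndex + 1 : currentIndex - 1

    // Stop when we run past either end of the list.
    guard interestingPointElements.indices.contains(nextIndex) else { return nil }
    return UIAccessibilityCustomRotorItemResult(targetElement: interestingPointElements[nextIndex],
                                                targetRange: nil)
  }
}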

By mixing UIKit and SwiftUI we were able to add rotor support to SenseMath, even though it wasn’t supported directly. At the time of writing this article, this workaround is no longer needed, because SwiftUI now provides support for accessibility rotors.

Lessons learned

If I could take one lesson from this project, it would be that everything is possible in code. So don’t hesitate to pick up a new framework like SwiftUI. Even though Apple doesn’t always provide the tools you need, you can always work around the gaps and find a solution. Sometimes Apple will catch up and do it for you in the next version of the framework, like in the case of the accessibility rotor, which is now fully supported in SwiftUI.

On top of that, you may find a more efficient way to solve a problem a year later. While writing this article I found out that we didn’t have to use UIKit to draw the grid: we could probably use similar logic with SwiftUI’s Path type to get it to work.
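
For example, the axis-drawing logic from earlier could look something like this as a SwiftUI Shape. This is a sketch under the same assumptions as the UIKit version (relative axis positions between 0 and 1), not code from the app:

import SwiftUI

// The same idea as the UIBezierPath example, expressed as a SwiftUI Shape.
struct AxisLinesShape: Shape {
  var horizontalAxisLine: CGFloat?
  var verticalAxisLine: CGFloat?

  func path(in rect: CGRect) -> Path {
    var path = Path()

    if let horizontalAxisLine = horizontalAxisLine {
      let y = horizontalAxisLine * rect.height
      path.move(to: CGPoint(x: 0, y: y))
      path.addLine(to: CGPoint(x: rect.width, y: y))
    }

    if let verticalAxisLine = verticalAxisLine {
      let x = verticalAxisLine * rect.width
      path.move(to: CGPoint(x: x, y: 0))
      path.addLine(to: CGPoint(x: x, y: rect.height))
    }

    return path
  }
}

// Usage: AxisLinesShape(horizontalAxisLine: 0.5, verticalAxisLine: 0.5)
//          .stroke(Color.primary, lineWidth: 1)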

Or sometimes a feature will never be supported because the demand for it is too low. But that shouldn’t stop you from implementing it with a detour, even if it feels cumbersome. Although it is important to write good code, the goal should always be to provide the best experience possible for your users.


Do you also love working with new technologies like SwiftUI? Check our job vacancies (in Dutch) at werkenbij.q42.nl!