I’ve been working on iOS versions of some of my development tools, and I have been struggling to figure out why touch UIs are so poor for this kind of application.
A huge problem is that the bottom half of the screen is a dead zone because of the on-screen keyboard. Anything displayed where the keyboard will appear has to be sacrificial, not critical to the task at hand. For tools that edit text, this means the text around the insertion point and any important context must fit in roughly half the screen. For programming tools, where context is essential, this is a big problem, and it is at its worst when the device is in landscape mode.
As a user, you are forced to keep scrolling around in anticipation of the on-screen keyboard appearing, so you don’t lose the passage of text that interests you.
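From the app side, the usual mitigation is to shrink the editable area when the keyboard appears rather than letting it cover content. A minimal UIKit sketch (iOS 15+; the class name is hypothetical) pins the text view’s bottom to the system’s keyboard layout guide:

```swift
import UIKit

// Sketch only: pin the editor's bottom edge to view.keyboardLayoutGuide,
// so the keyboard pushes content up instead of covering it.
final class EditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(textView)
        textView.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            textView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            textView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            textView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            // keyboardLayoutGuide tracks the keyboard's top edge; when the
            // keyboard is hidden it follows the bottom safe area instead.
            textView.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor),
        ])
    }
}
```

This keeps the insertion point visible, but it doesn’t solve the underlying problem: half the context is still gone while you type.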
On the iPad Pro this is a little less of an issue, as you have more screen space to work with, especially in portrait mode, but it’s still a challenge. Obviously, a hardware keyboard is one way to solve this problem, but you cannot count on the user having one at hand all the time.