The thing to remember is that what they are developing here is an interface, not an application. Things like dropping data into an application or reading a book would be specific to the particular application, not the interface. As an analogy, some Windows applications let you drag and drop items into their windows and will process them accordingly -- others will not. It's the Windows interface that gives you the ability to drag and drop, but it's up to the particular application to interpret what to do with the item once it is dropped.
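To illustrate that division of labor, here's a toy sketch (in Python, with made-up names -- this is not any real Windows API): the "interface" layer knows that a drop happened and routes it, while each application decides whether, and how, to handle it.

```python
class Interface:
    """Stands in for the windowing system: it knows *that* a drop
    happened and on which window, but not what the item means."""
    def __init__(self):
        self.apps = {}

    def register(self, name, app):
        self.apps[name] = app

    def drop(self, target, item):
        app = self.apps[target]
        handler = getattr(app, "on_drop", None)
        if handler is None:
            # The app never opted in, so the drop does nothing.
            return f"{target}: drop ignored"
        return handler(item)


class ImageViewer:
    """An application that chose to handle drops."""
    def on_drop(self, item):
        return f"opening {item}"


class Calculator:
    """An application with no drop handler at all."""


iface = Interface()
iface.register("viewer", ImageViewer())
iface.register("calc", Calculator())

print(iface.drop("viewer", "photo.png"))  # opening photo.png
print(iface.drop("calc", "photo.png"))    # calc: drop ignored
```

The interface provides one uniform gesture; the meaning of that gesture is supplied (or not) by each application -- which is exactly the situation with the multi-touch screen in the video.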
In the video I linked to, he is working on a screen that would be more practical for a cubicle. Size need not be an issue here -- touch screens are available in many sizes. You probably already use a small one when you take out money from an ATM, or an even smaller one if you use a PDA. The difference here is in how the points of contact are interpreted and how the information is displayed. Also note that this would replace not only your monitor but also your mouse and keyboard, which gives you a bit more room to spread out a larger screen.
If you haven't checked out the video I linked, you may want to, as there is at least one thing he demonstrates that I'm sure you will agree has practical applications (perhaps not for your business, but certainly for others). I'm talking specifically about zooming into and navigating a 3D map, as well as switching between different views such as rainfall or vegetation. But again, the actual application came from (I think) NASA -- they just applied their interface to it to allow zooming and rotation via hand gestures.
You are probably right about one thing, though -- they likely can't do some of the more practical stuff yet, simply because the software hasn't been written for the interface. They are exploring what can be done with the interface, and so the software they're developing is meant to show off its different features.