Offline VDI has recently gained a lot of interest. Offline support has been a common request of Citrix for many years, but until virtualization came along it looked very hard to accomplish in a believable way.
I was searching for interesting old ideas in the Citrix Product Ideas database and stumbled across Idea 420, entitled “Offline access”. The idea came straight from a customer; I simply tried to capture it and form some guesses about how it might be done.
For amusement purposes only, I’m including what I wrote for that idea. Remember that this was May 22, 2003, right after iForum in Sydney.
I was asked for this at iForum by a customer.
His assumption was that if he was working on something, he could continue to work on it even while offline. He would then sync up whatever work he had done when he connected again.
This might work fine in a file-based world, but does not carry forward very well in Citrix’s area of expertise.
It raises the question of what kind of work a user would be doing remotely AND offline.
The most common cases are working on documents or presentations.
It would be rare for them to expect that they would be able to talk to a server since they are indeed offline.
So, they are really just confused about where the work is being done. If they knew that they were actually working online all the time, with the software running on the server, they would not ask for offline access, because they would know the app cannot run on their client at all.
But they do not seem to know this. There is nothing wrong with that; it really shows how transparent we have made things for them.
So, there really seem to be only a few options.
1. Tell them to install the server app software on their client and then manually copy the files back and forth
2. Create a process for automatically installing apps that will be run offline on the client and have a sync method to go between server and client
3. Pretend that we are the software and treat everything like a generic case and later sync up any changes [??? don’t know what this means in 2008]
4. Create a new model that makes it less obvious where things are running so that it can happen on the server or client with or without a connection
5. A miracle
I think I like number 2 the best. It is the most practical, and I would guess that Microsoft would probably choose that kind of model. Of course, this means you would probably have to install the likes of Office on clients that might not support it, but at least it would run the way the customer expects. The syncing feature would also be useful, since it would probably be better than what people normally do when they go on the road.

There is even a good chance that these users already have the native apps on their client, and all they really want is the ability to work on their documents and presentations in a more transparent fashion. For example: if I am offline, use the copy I checked out; if I am online, use the copy from the server that I have just checked in again. A library model (books) would work well in this case.

This concept would not work well in a database-type environment where the file is huge and many people might be working on it at once. The best you could do is check out the section of the database you wanted to work on (if that were possible) and take it on the road. I think people are still going to get confused, but we can do a lot to improve the situation when the user is offline.
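The library model from option 2 can be sketched in a few lines. This is a minimal, in-memory illustration of check-out/check-in semantics, not any actual Citrix mechanism; the class and method names (`OfflineLibrary`, `check_out`, `check_in`) are hypothetical.

```python
# A toy sketch of the "library model": documents are checked out like
# library books, edited offline, then checked back in. All names here
# are hypothetical illustrations, not a real product API.

class OfflineLibrary:
    def __init__(self):
        self.server = {}        # document name -> content on the server
        self.checked_out = {}   # document name -> local working copy
        self.locks = set()      # documents currently on loan

    def check_out(self, name):
        """Take the server copy offline; lock it so no one else edits it."""
        if name in self.locks:
            raise RuntimeError(f"{name} is already checked out")
        self.locks.add(name)
        self.checked_out[name] = self.server[name]
        return self.checked_out[name]

    def edit_offline(self, name, new_content):
        """Edits made while disconnected touch only the local copy."""
        self.checked_out[name] = new_content

    def check_in(self, name):
        """Back online: sync the local copy up and release the lock."""
        self.server[name] = self.checked_out.pop(name)
        self.locks.discard(name)


lib = OfflineLibrary()
lib.server["report.doc"] = "draft v1"
lib.check_out("report.doc")          # go on the road
lib.edit_offline("report.doc", "draft v2, edited on the plane")
lib.check_in("report.doc")           # reconnect and sync
print(lib.server["report.doc"])      # draft v2, edited on the plane
```

The lock is exactly what makes this a library rather than a merge problem: only one person can have the book out, so check-in never conflicts. That is also why it breaks down for the shared-database case described above.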
Returning to 2008…
I got a good chuckle out of option number 5. All of this happened before Offline VDI emerged as a possible way to pull off such a trick. Things have become much clearer since then; it is still not completely solved, but it is not as far off as it seemed in 2003.
Even with the current visions, it is not obvious how this could be made transparent. The ultimate model would call for dual execution, so that the local copy takes over when the network is lost. At the current pace this is still years away and could require remodeling how applications work. In the non-Windows world this is already happening for the sake of web applications; newer applications are becoming more tolerant of offline access.
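The dual-execution idea can be sketched as a simple failover wrapper: the same operation prefers the server path while connected, and a local fallback quietly takes over when the network drops. This is purely illustrative; `DualExecutor` and the save functions are invented names, not any real Citrix API.

```python
# A hypothetical sketch of "dual execution": run against the server when
# online, fall back to the local checked-out copy when the network is lost.

class DualExecutor:
    def __init__(self, remote_fn, local_fn):
        self.remote_fn = remote_fn   # normal online path
        self.local_fn = local_fn     # offline fallback

    def run(self, *args):
        try:
            return self.remote_fn(*args)
        except ConnectionError:
            # Transparent takeover: the user never sees the failure.
            return self.local_fn(*args)

def remote_save(doc):
    raise ConnectionError("network lost")      # simulate going offline

def local_save(doc):
    return f"saved '{doc}' to the local checked-out copy"

ex = DualExecutor(remote_save, local_save)
print(ex.run("report.doc"))   # saved 'report.doc' to the local checked-out copy
```

The hard part, of course, is not the wrapper but keeping the two execution environments consistent enough that the switch really is invisible, which is why the post calls this model years away.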
The current vision focuses on taking a VM, checking it out, and running it on a laptop. When the offline session is over, the user checks the changes back in. The good news is that this concept is very easy to explain and understand. The bad news is that it is difficult to achieve and tends to ignore the limitations of moving and syncing VMs. Still, baby steps move the technology forward, regardless of the pace.
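One of those syncing limitations is sheer size: copying a multi-gigabyte disk image on every check-in is a non-starter, so in practice only the blocks that changed while offline are worth moving. A toy block-delta sketch, under the assumption of fixed-size blocks (real implementations are far more involved):

```python
# Why syncing whole VMs is hard: instead of copying the entire disk
# image at check-in, compute and ship only the changed blocks.
# Illustrative only; this is not Citrix's actual mechanism.

BLOCK_SIZE = 4  # tiny blocks for the example; real images use 4 KB or more

def blocks(image):
    """Split a disk image into fixed-size blocks."""
    return [image[i:i + BLOCK_SIZE] for i in range(0, len(image), BLOCK_SIZE)]

def delta(before, after):
    """Return (index, block) pairs for every block that changed offline."""
    return [
        (i, b) for i, (a, b) in enumerate(zip(blocks(before), blocks(after)))
        if a != b
    ]

def apply_delta(before, changes):
    """Reconstruct the checked-in image from the base plus the delta."""
    out = blocks(before)
    for i, b in changes:
        out[i] = b
    return b"".join(out)

base = b"AAAABBBBCCCCDDDD"      # image as it was checked out
edited = b"AAAABBBBXXXXDDDD"    # one block modified offline
changes = delta(base, edited)
print(len(changes))             # 1 -> only one block travels at check-in
assert apply_delta(base, changes) == edited
```

Even with deltas, the check-in window, conflicting server-side changes, and image resizing remain open problems, which is why the paragraph above calls the simple checkout story harder than it sounds.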
Sometimes (if not most of the time) it is just as important to sell the pitch to consumers in a simple way as it is to describe how it might ultimately be deployed. In other words, engineers often make the mistake of being technically accurate and deluging non-technical consumers (including internal non-engineers) with detail. This tends to polarize internal opinions toward the simpler model, since the more precise model cannot be understood.
Having Offline VDI support would benefit Citrix customers, given that offline access has been an outstanding requirement for many, many years. Obviously, if people keep asking for it over and over, there must be a valid business case that would help both sides.
I’m confused why you would implement VDI as a solution to their problem. Why not just stream the applications down to the system with XenApp and call it a day? Then they have access to their applications when they are offline, and when they get back online everything resyncs. That is, unless you’re talking about a web application, but that is where Google Gears or Adobe AIR comes into play. And all of this technology, my friend, is available today. See here for more info: http://www.adobeairtutorials.com/2007/08/26/google-gears/ and http://blogs.zdnet.com/Stewart/?p=688 Also, since you were at BriForum, I recommend that you take a look at the video for the Future of Client Computing session here: http://www.briforum.com/BriForum-2008-Chicago/session.asp?id=371
I’ve thought about what you said, and the answer seems to be that bringing down just the applications is not enough. You would need to track the whole environment the user is accustomed to in the data center. It is not just about the individual applications, but also the user’s files and whatever other baggage is necessary to get things working.
The key message is that application streaming and virtualization can only carry you so far. Sometimes you just need to punt and keep the whole environment with you. This would be especially true for the potential to run on non-Windows platforms like Linux and Mac.
I could see this debate going on and on, if it weren’t for the fact that I was a bit slack in answering it.
Yes, I was at the Future of Client Computing session. I’m not quite sure why you mentioned it as a basis for streaming. Offline VDI was actually pointed to as a solution that will be coming in just a few years.
Sorry about not getting back sooner.