Saturday, April 23, 2011

Advancing The Cloud


I've been a TeraRecon fan for quite a while now. Sadly, I'm not a customer as yet, since we cannot convince our administration to cough up the cash in these hard economic times.

But TeraRecon has recently provided a very economical alternative. How does FREE sound? Yes, for the time being, you can have the power of TeraRecon at no charge.

In a recent article on ImagingEconomics.com, TR president Robert Taylor, PhD, outlines the problem many of us out in the boonies face when trying to implement (or justify) advanced imaging:
Advanced visualization—for all of its benefits—also comes with its challenges, said Robert Taylor, PhD, president and CEO of TeraRecon, who outlines the challenges in this way: it is expensive (you have to have enough volume to justify investment in a complete system); it is an IT deployment (you need an IT department to deploy and manage it); and it is confined to your organization (if you need to reach outside your organization to share or collaborate, it can get complicated).

In order to respond to these challenges, the company announced iNtuition CLOUD late last year. With this new solution in place, new customers can have their iNtuition UNLIMITED licenses hosted within the company's cloud solution. To date, more than 400 users have signed up for evaluation accounts, which were offered free of charge to professional attendees at RSNA '10 and continue to be available at www.terarecon.com/cloud.
I'm one of those 400 users, and I can vouch for its functionality. Maybe it isn't quite as seamless as having an in-house installation, but iNtuition CLOUD does work, and it works very well. For all intents and purposes, you have a full iNtuition product at your disposal via the Internet, I mean Cloud.
On the client side, iNtuition CLOUD functions as a browser link, according to the company. After logging in, the user can start to work with their studies. On the server side, when a user connects and logs in to their account, the cloud establishes a secure link to an appropriate server from the company's global network that has the user's data and is located in a geographically appropriate location.
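
TeraRecon hasn't published how that routing actually works, but the idea is easy to picture. Here's a minimal sketch, assuming a static table of datacenters and regions; the hostnames, account names, and distance values are all made up for illustration and are not TeraRecon's design:

```python
# Purely hypothetical sketch of geo-aware session routing.
# Each datacenter advertises its region and the accounts whose studies it hosts.
DATACENTERS = [
    {"host": "us-east.cloud.example.com", "region": "us-east",
     "accounts": {"dalai_group", "mercy_radiology"}},
    {"host": "eu-west.cloud.example.com", "region": "eu-west",
     "accounts": {"london_imaging"}},
]

# Rough inter-region "distance"; a real service would use latency probes
# or GeoIP data rather than a static table.
REGION_DISTANCE = {
    ("us-east", "us-east"): 0, ("us-east", "eu-west"): 1,
    ("eu-west", "eu-west"): 0, ("eu-west", "us-east"): 1,
}


def pick_server(account, client_region):
    """Return the closest datacenter that actually holds this account's data."""
    candidates = [dc for dc in DATACENTERS if account in dc["accounts"]]
    if not candidates:
        raise LookupError("no datacenter hosts data for account %r" % account)
    best = min(candidates,
               key=lambda dc: REGION_DISTANCE.get((client_region, dc["region"]), 99))
    return best["host"]


# The browser session would then be carried over HTTPS to, for example:
# pick_server("dalai_group", "us-east")  ->  "us-east.cloud.example.com"
```
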
To be fair, I haven't tried it out on the iPhone or iPad, but I see no reason why it won't work on these devices, as the image above suggests.

Naturally, there are a few areas where one must convince the IT folks and lawyers that this is a kosher approach:
Taylor concedes that one of the biggest challenges is that health care providers have to get comfortable with the legal, regulatory, and security aspects of putting patient data into the cloud and entrusting it to a third party like TeraRecon. He's aware that legal and regulatory departments within health care facilities will need to develop the right contractual and policy frameworks that will allow them to use iNtuition CLOUD and still comply with HIPAA, HITECH, and all of the other applicable obligations. The company's immediate goal is to get health care facilities talking about the possibilities related to iNtuition CLOUD.
Hey, if you can't trust TeraRecon, who can you trust? Stated differently, if you can trust certain larGE companies, you can certainly trust TeraRecon.

Dr. Taylor concludes:
"2011 will be the year that iNtuition CLOUD 'arrives' in a very meaningful way as a part of the advanced visualization landscape."
It has definitely arrived. Anyone looking into advanced visualization should certainly check out iNtuition CLOUD. Having the power of TeraRecon for free is like having Hanukkah in July.


ADDENDUM

Dr. Taylor sent some comments on this post:

The article references 400 users, but we're actually past 1000 now and just made a PR about it. You can find a link to the PR at the top of the press releases column on the right of our homepage: www.terarecon.com

iNtuition Cloud is working out really well, and in some surprising ways. We are starting to commercialize it now, with several heavy evaluators turning into paying customers, even at their own request - people need a contract in place to use the service formally, I suppose.
But in addition to this, we have found Cloud to be very helpful for our prospective customers to use in a "try before you buy" mode. We used to feel that if we didn't provide a carefully controlled demo with a specialist there in person, we couldn't do our product justice and might leave a sub-optimal impression, but with Cloud, people are trying it, digging in, figuring things out, asking us questions, and getting a sense of comfort that our solution will work for them, without any major issues. It's very exciting.

I would say that in 2011 so far, I can point to several significant purchases that were driven by Cloud evaluations. One was an extremely famous cancer center that wanted to first make sure we could load their data and perform the specific measurements they needed, and another was a major radiology group that wanted to see how the system could work for their distributed group of interpreting physicians, before jumping in to buy a tool to enable them to offer wider interpretation services, including brain perfusion, cardiac, EVAR, etc.

In some cases, people try the Cloud and buy the Cloud. In others, they try the Cloud and then buy a system. In a couple of cases, we have even sold "Private Clouds" where the customer buys a set of the equipment we use to operate the Cloud, and then sets it up in their own datacenter and operates and controls it for their staff only (e.g. radiologists working from a wide range of remote locations needing access from a browser).

We have also been adding features, like automatic anonymized, compressed DICOM routing to the cloud, relay of derived results back to the site, and PACS integration, so you can launch the Cloud client from a supported PACS system (instead of needing a local client installed).

It's definitely picking up speed and we're having a lot of fun with this!
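
To make Dr. Taylor's "automatic anonymized, compressed DICOM routing" a bit more concrete, here is a minimal sketch of the general idea using the open-source pydicom library. This is my own illustration, not TeraRecon's code, and real de-identification has to follow the full HIPAA / DICOM confidentiality profiles rather than the short tag list shown here:

```python
# Illustrative sketch only: strip obvious patient identifiers from a DICOM
# file and gzip it before a (hypothetical) routing agent pushes it to the
# cloud over TLS.
import gzip
import hashlib

import pydicom  # third-party library: pip install pydicom


def anonymize(ds, site_salt):
    """Replace direct identifiers with a salted hash so derived results can
    be re-associated with the patient when they are relayed back on-site."""
    token = hashlib.sha256(
        (site_salt + str(ds.get("PatientID", ""))).encode()).hexdigest()[:16]
    ds.PatientName = "ANONYMOUS"
    ds.PatientID = token
    ds.PatientBirthDate = ""
    # Remove a few other identifying tags if present (deliberately incomplete).
    for keyword in ("OtherPatientIDs", "PatientAddress", "ReferringPhysicianName"):
        if keyword in ds:
            delattr(ds, keyword)
    return ds


def prepare_for_upload(dicom_path, site_salt):
    """Anonymize and gzip one DICOM file; returns the path of the .gz file."""
    ds = pydicom.dcmread(dicom_path)
    anonymize(ds, site_salt)
    anon_path = dicom_path + ".anon.dcm"
    ds.save_as(anon_path)
    gz_path = anon_path + ".gz"
    with open(anon_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        dst.write(src.read())
    return gz_path
```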

4 comments:

Martin P said...

I'm sceptical about the nature of 'cloud'. Only last week, many services with considerably more visible impact than a TR service (including Reddit, Foursquare, and Quora) went down for over a day when the Amazon cloud facility disappeared (link: CNN).

Many folk thought Amazon might have found the formula for reliable cloud, but this incident suggests otherwise. The notion post-incident is that one must host one's services with multiple providers, but there isn't a clear way to manage the routing in such a situation.

There are undoubtedly many benefits to cloud services. Many of those are benefits to the vendor, though, and I'm reminded of the industry 10-12 years ago, when vendors pushed browser-based delivery largely to their own benefit, when in fact browsers have only recently (in the last 6-12 months, without adding additional requirements like Java) become capable of adequate delivery.

Will the cloud work? Probably - the benefits are clear, but NO service provider has worked out (yet) how to deliver a truly reliable service. It's not ready yet for a critical service.

Martin P said...

An afternote on the Amazon thing I just bumped into: a company was hosting cardiac patient monitoring on EC2.

It happens.

Robert Taylor said...

The Amazon debacle is very pertinent.

The Amazon model is to take the application as a software image and to replicate it across their global grid of servers and datacenters.

The company providing the application doesn't know where it ends up, or where the data it is processing ends up.

At TeraRecon we felt that was an inappropriate way to handle medical records in general and images in particular.

There are laws and regulations and policies to consider, and we decided we need to be able to clearly state to a customer where their images are, and what measures are in place to protect them.

For that reason, rather than contracting the Cloud hosting to a platform provider like Amazon or Microsoft, we have direct control of specific servers in specific locations, and we can describe the physical and logical security, redundancy provisions, and HIPAA accounting to customers who need to know.

No system is perfect, but this system is dedicated to providing only one service, and it does get very high quality attention continually, from specialists in medical imaging and imaging informatics.

We're optimistic this approach can offer equal or better uptime and availability than many in-house deployments we have seen. Our team only has to keep one application live in a largely controlled environment, unlike in-house informatics teams, who have to wear multiple hats and fight multiple fires, often against significant headwinds in terms of policy and resources.

It is true that with Cloud, there is a lot more at stake when there's an outage, because many subscribers can be impacted at once, but from any one subscriber's point of view, it's probably not worse than in-house (downtime happens there too), and possibly better, because there's always another datacenter to fail over to. Unless the whole Internet is down, a speedy recovery of basic service should always be possible, even if automatic failover doesn't function as expected.

It is a very fair comment, however, and this is a voyage of discovery. We are uncovering the challenges and realities and defining the best role for Cloud Computing in medical imaging and advanced visualization.

It won't be a panacea, but it seems to have some value to offer when used as part of a wider imaging informatics strategy.

Martin P said...

Robert, thanks for the response. You make a very valid point about in-house deployments, which, I agree, are often less than optimal.

One particularly positive note about the Amazon 'event' is the level of debate about what went wrong, why, and, most importantly, people's expectations. A couple of examples here and here.

The consensus (otherwise known as a re-learned truism) is that with all the hardware and stack redundancy, the application itself bears the ultimate responsibility for availability.

All that said, I do wish you good speed with one of the few examples of real innovation we see around here.