Thursday, August 02, 2012

Two Words on Why I do not trust "The Cloud"



TheGlobe.com and iGoogle.

You can likely find quite a few more words to say (Microsoft's various dropped features, software, and platforms over the years come to mind), but I spent a lot of my time creating content for both TheGlobe.com, which was the first host of my personal web page, and iGoogle, which I have used for about five years as my personal portal.  The former has been gone for a long time, and the July 3rd announcement from Google now makes the latter soon defunct.  The time I invested in both was for naught.  With the former, I had backups of the content; with the latter, no such backup can be made.  I have found netvibes.com to be the closest substitute, but I question how much I want to rely on another "Cloud" solution.

I do not trust "The Cloud" because I think it is foolish to trust private companies with your stuff (or, if you are a business, another private company with your business' stuff).  The cloud host may go bankrupt, be bought, change strategies, or simply be poorly managed.  Too many things can go wrong, and then your "stuff" is gone.  Google, Microsoft, and Apple have all handed their customers such disappointments, and new start-ups are even riskier.

In-sourcing is now the strategy du jour in IT, yet we simultaneously talk about moving applications and data storage to "The Cloud".  I was never a big advocate for outsourcing IT, and I am no fan of "The Cloud".  Outsourcing your IT, your applications, and your data certainly has advantages, and if there were some standardized or legislated protection to ensure your investment in time, money, and data were protected indefinitely, I would seriously reconsider my position.  [Side note: if a lawyer ends his practice, his files become part of the archive of the state in which he practiced.]  Until then, I feel outsourcing IT is like a car dealership outsourcing its sales force.  Yes, they might save some money, but the outsourced salesman just wants to sell cars; if he can sell yours, great, but if it is easier or more profitable to sell the cars from the dealership down the road, that is what he is going to do.

If you can't afford to own and host your own data and critical applications, then I advise you to reconsider whether you should invest your time and your employees' time in generating and maintaining them in the first place.  I will make an exception for common applications like Word, Excel, etc., where the application is in "The Cloud" but the data are ultimately stored in-house.  Even then, however, you take the risk that these "Cloud" applications will remain available, affordable, and compatible with your data for the lifetime of that data's usefulness (which, in the analytical chemistry and research world, could be perpetuity).

By the way, the last time I felt this strongly about a bad idea was when I heard about sub-prime mortgages.  I took 80% of my money out of the stock market... then it crashed.

Just the humble opinion of someone twice burned!

Friday, May 09, 2008

Extreme Instruments


Chemical & Engineering News


Scientific instrument makers, often-hidden contributors to great scientific revolutions of the past, now are focusing on development of a new generation of the third most common instrument found in modern chemistry labs, according to an article scheduled for the April 28 issue of Chemical & Engineering News (C&EN), ACS’s weekly news magazine.


These so-called “liquid chromatography” machines rank behind only the laboratory scale and the pH meter among chemistry’s most ubiquitous instruments, Senior Editor Mitch Jacoby notes in the C&EN cover story. Chemists use chromatography to analyze complex solutions of chemicals in the search for better medicines and more durable materials, and in a range of other research.


Instrument makers are responding to a critical need for faster, more powerful versions of one particular tool, termed high performance liquid chromatography, or “HPLC,” where the “P” also often can stand for “pressure,” the article says. Jacoby describes the quest for new generations of HPLC tools with the ability to separate chemicals faster and more precisely than ever before. “Extreme” HPLC instruments already are speeding laboratory work in drug companies and other settings, with even better instruments on the horizon, the article suggests.



This story will be available on April 28 at
http://pubs.acs.org/cen/coverstory/86/8617cover.html


FOR ADVANCE INFORMATION, CONTACT:

Michael Bernstein
ACS News Service
Phone: 202-872-6042
Fax: 202-872-4370
Email: m_bernstein@acs.org

Wednesday, February 20, 2008

Kari Dalnoki-Veress receives 2008 John H. Dillon Medal


Springer editor Kari Dalnoki-Veress has been chosen as the 2008 John H. Dillon Medal recipient. He receives the prize "for significant and innovative experiments in glass formation and polymer crystallization at the nanoscale." The medal will be presented to Dalnoki-Veress at the meeting of the American Physical Society, March 10-14, 2008, in New Orleans, USA.

Saturday, November 10, 2007

Scientific authoring and publishing in the new age of multimedia: A Better Idea?

I would like you to indulge me for a moment and consider the following item from a recent entry in the Amazon Daily blog, forwarded here via my Google Reader account. Not only is it interesting from a digital photography standpoint, it also has implications for my vocation as an analytical chemist; more about that after you read the article.

via Google Reader, from Amazon Daily, 11/6/07:

I learned about digital photography metadata from the FBI. They didn't give me a course or anything; they sent out a news release years ago warning that terrorists may be sending messages encoded into pictures on the Internet. As far as I know, your digital camera isn't serving Al Qaeda's ends, but it did lead me to discover that there's a whole world of information in a JPEG beyond the pictures you see.

You probably know what metadata is -- we use it all the time now, whether putting track names on an MP3 or looking up a library book by its Dewey Decimal number. You may even have used it on your photos, tagging a shot of the beach as "vacation." But it can be staggering to learn how much information there actually is.

Example: This is a JPEG:

But this is the information inside. The shutter speed, the aperture, user-added tags, even latitude and longitude -- they're all in there. Don't worry, though -- people can't find out your address from your online photos unless you want them to, or unless you're careless. There are two groups of metadata important to understand -- EXIF and IPTC. Broadly speaking, EXIF is what is captured on the image at the time you shot it, and IPTC are all the user-added comments afterward. Your EXIF knows what the camera was doing -- when the picture was taken, how fast the shutter went, what lens you used, etc., but it doesn't know that you were taking a picture of your aunt. That's for you to add.
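
If you want to see it for yourself, a few lines of Python will print the top-level EXIF tags from any JPEG. This is just a minimal sketch using the Pillow imaging library; the filename is a placeholder.

from PIL import Image
from PIL.ExifTags import TAGS

# Open a JPEG and pull out its EXIF block ("vacation.jpg" is a placeholder name).
image = Image.open("vacation.jpg")
exif = image.getexif()

# Translate the numeric tag IDs (camera make, model, timestamp, and so on)
# into readable names and print each entry.
for tag_id, value in exif.items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")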

What good is metadata? It's hard to even scratch the surface of the possibilities, but two major benefits are organization and improving your craft by figuring out how other people took photos you enjoy. The benefits for organization are obvious to anyone who's ever tried to sort through a basement full of negatives. The difference between sorting through a box that says "Mid-80s" and being able to zoom in on photos you took on April 12 is palpable. If you take advantage of programs that can add metadata -- from free programs like Apple's iPhoto to professional software like Adobe Lightroom -- you can be as organized as you want. Maybe you want to note all of the people in the photos you take. Maybe you're into color and want to track which photos were dominantly red and which were dominantly green. Maybe you want to track all the photos you took in Hawaii -- whatever you want, the notes affix themselves to the photo, so you'll have them wherever you store it.

But there's more. This is the Web 2.0 age, a time of unprecedented sharing. You don't have to ask a photographer "Hey, what camera did you use to take this?" If you can see the EXIF, you'll know, along with what focal length, ISO, shutter speed and aperture he or she used to make the photo look the way it does. Sites like Flickr or Picasa Web Albums have the EXIF available for public viewing on every photo (at the photographer's discretion), and they can be a great course in learning exactly what sorts of images these parameters can create. Sure, you can know intellectually that a long focal length lens plus a wide aperture equals a shallow range of focus, but isn't it easier to see it?

Here's the EXIF for this image. What does it tell us? The shallow focus comes from the combination of focal length (85mm), aperture (f/1.4), and distance (not shown, but about 15-20 feet). The shutter speed had to be 1/100th to freeze motion (and that was pushing it; faster would have been better if possible), and the ISO had to be bumped up to 1600 to allow that shutter speed to happen indoors. You can learn a lot about a photographer's choices from looking through EXIF.

So what can't you learn? In advanced photography, actually snapping the shutter is the least difficult part -- it's getting to the spot where the picture happens that matters. Whether that's a physical journey to exotic locales, the sidelines of the big game, waiting for the perfect sunset, setting up studio lights, or simply getting your subject to react to you in an attractive way, EXIF won't help you there.

Take this photo:

This was taken at 102mm, f/13, and ISO 100. We can learn a few things about the photographer's choices from that: it's attractively proportioned because of the long focal length, and the small aperture and low ISO meant standard room light would register as totally black, so this was lit by bright flashes. But we can't go out and recreate this photo just knowing those settings. And that's just a studio shot -- EXIF certainly isn't going to get you a gorgeous shot of the Hindu Kush mountains. Because of that, I hope that photographers are more free with letting their EXIF show on Web-based photos so beginners and intermediate students can learn valuable things about the mechanics of a shot. It's a great way to learn Photography 101, especially for the visually oriented people who become photographers, but it's not going to cost the pros their jobs.


--Ryan Brenizer


Now consider...

In measurement science, and analytical chemistry in particular, data are "converted" into pictures in at least two major ways. First, and most obviously, they are presented as images: photographs and videos. Second, and perhaps less obviously, there are the pictures of data that we scientists would normally call spectra. In both cases, one can imagine the same kind of metadata described in this article "enhancing" those images. It could include every instrumental parameter needed to obtain the image. Moreover, standard operating procedures (SOPs), sample preparation, chain of custody, and every datum necessary to establish the pedigree of the sample studied could be embedded directly with the spectral data. Indeed, the entire analyst's report could be part of the photograph, video, or spectrum.
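
As a rough illustration, here is a minimal sketch in Python using the Pillow library that attaches such a pedigree to a rendered spectrum image as PNG text chunks. The filenames, keys, and values are hypothetical placeholders; a real laboratory system would follow an agreed-upon schema rather than these ad hoc names.

from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Load a rendered spectrum image ("spectrum.png" is a placeholder filename).
spectrum = Image.open("spectrum.png")

# Build text chunks carrying the sample's pedigree.
# Every key and value here is illustrative, not a standard.
meta = PngInfo()
meta.add_text("Instrument", "FTIR, 4 cm-1 resolution, 32 scans")
meta.add_text("SOP", "SOP-123 rev 4")
meta.add_text("ChainOfCustody", "Sample 2007-1142, received 2007-10-30")
meta.add_text("Analyst", "J. Doe")

# Save a copy with the metadata embedded alongside the pixels.
spectrum.save("spectrum_with_pedigree.png", pnginfo=meta)

# Anyone opening the file later can read the pedigree back out.
print(Image.open("spectrum_with_pedigree.png").text)

The same idea carries over to EXIF or XMP for photographs and videos; the essential point is that the pedigree travels with the image rather than living only in a separate report.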

Today, we consider such data as parts of reports. The emphasis is on the written word, and each written report may contain or reference several images, videos, or spectra that enhance an author's point within a specific passage of the report. My questions are: can we do better, and is this approach always the best?

Images, whether they are photographs, videos, or spectra, make a powerful impact on the viewer. Indeed, the truism that a picture is worth a thousand words stands unrefuted. Why then, as scientists, do we cling to burying such powerful media in written reports? I think there is a better way.

Consider Amazon again for a moment. When you search for a book, the emphasis is on the image of the book and its title; lately, hovering over a book's image brings up a list of links leading you to more information, much of it text but, potentially, some of it more imagery. Amazon does not bury the image of the book within the bounds of prose that might describe the book.

So how would this work in science? Imagine that instead of the National Transportation Safety Board (NTSB) publishing a long report on an accident that includes a few pictures, it chose one image that highlights a particular aspect of the investigation, or the investigation as a whole, and created links and metadata within that image that reference and cross-reference items in the report. The table of contents could be a series of images, each followed by one line of text, and the reader would quickly connect with the contents through the powerful imagery being presented.

Many other examples can be imagined. Databases emphasizing image-based content already exist, with the inclusion of chemical structures, spectral thumbnails, and the like. Scientific journals are starting to include images as part of their web-based tables of contents, but does this go far enough in encouraging authors to emphasize image-based authoring? Show a spectrum and embed the report, rather than the other way around. Relative to publishers, research labs in the chemical and petrochemical industries are lagging significantly in this regard, with little capability today in the content management, knowledge management, and multimedia publishing systems needed to enable these approaches. As multimedia publication becomes cheaper, more ubiquitous, and more flexible, should we consider the traditional scientific publication format passé?

I hope to have time in the future to create some specific examples highlighting the advantages I envision such an approach to have.

Thursday, November 01, 2007

Resolving Oligomers from Fully Grown Polymers with IMS-MS

Sarah Trimpin, Manolo Plasencia, Dragan Isailovic, and David E. Clemmer*

Department of Chemistry, Indiana University, Bloomington, Indiana 47405

Anal. Chem., 79 (21), 7965-7974, 2007. DOI: 10.1021/ac071575i. Web Release Date: September 22, 2007. Copyright © 2007 American Chemical Society.

Link to Abstract

Ion mobility and mass spectrometry techniques, combined with electrospray ionization, have been used to examine distributions of poly(ethylene glycols) (PEG) with average molecular masses of 6550 and 17,900 Da. The analysis provides information about the polymer size distributions as well as smaller oligomers existing over a wide range of charge states and sizes (i.e., [HO(CH₂CH₂O)ₓH + nCs]ⁿ⁺, where x ranges from 21 to 151 and n = 2 to 11 for the 6550 Da sample, and x ranges from 21 to 362 and n = 2 to 23 for the 17,900 Da sample).

Liquid NMR probes: Oh so many choices

Vendors offer a wide variety of probes with applications that include synthetic chemistry, protein structure determination, and metabolomics. This product review in Analytical Chemistry surveys some of the choices.

http://pubs.acs.org/subscribe/journals/ancham/79/i21/pdf/1107prodrev.pdf


Analyzing fermented beverages by microCE


In this issue of Analytical Chemistry (pp 8162-8169), Richard Mathies and colleagues at the University of California, Berkeley, describe their efforts to develop a simple point-of-consumption testing platform by adapting technologies that they first developed to look for signs of life on Mars.

Thursday, October 11, 2007

2007 Nobel Prize in Chemistry

The 2007 Nobel Prize in Chemistry has been awarded to Gerhard Ertl (Germany) "for his studies of chemical processes on solid surfaces". Dr. Ertl developed methodology to measure and analyze surface reactions under high vacuum, preventing the contamination that could invalidate the measurements. Ertl used these techniques to elucidate the details of the Haber-Bosch process, which is used to produce fertilizer for plants. He also studied the oxidation of carbon monoxide on platinum, a reaction used to clean automobile emissions.
