For the 27th time I’ve gotten up on the first business day of January looking forward to a new year with Eigenvector Research. The overall plan doesn’t change much: keep improving the software, provide training in data science/chemometric methods and help clients with their challenging data problems. But the details! So many things to decide! What features to add to our software? What courses to teach, and when? What research to pursue and where to present and publish? The list goes on…
This week we also welcome a new staff member, Sean Roginski (pictured at right). Sean completed his M.S. in Data Science at the University of North Carolina Wilmington last fall. Prior to that he did an internship with us, developing a Python software development kit (SDK) to enhance communication with our Solo_Predictor prediction engine. He also constructed an architecture that will help us integrate Python tools into PLS_Toolbox. We really liked what he did, so we hired him! And yes, if his name sounds familiar, he is the son of our Bob Roginski.
The addition of Sean Roginski brings our full time software development and support staff to five, including (in reverse order of seniority) Lyle Lawrence (3 years), Donal O’Sullivan (11 years), Bob Roginski (14 years) and R. Scott Koch (17 years). Given their 50 man-years of experience with our software (Bob and Lyle were PLS_Toolbox users long before they became employees) it’s no surprise that they make our Helpdesk actually helpful.
So we look forward to bringing you new tools and courses in 2021 and, as always, are available to help you with your data science projects.
The Lake Chelan Valley Scholarship Fund has announced 15 award recipients for 2020. The scholarships go to college-bound seniors from Chelan Valley schools, including Chelan High School and Manson High School, as well as to previous recipients renewing their awards. This year's group includes three graduates from this year's MHS class: Brenda J. Alonso Arreola, Nadia Tejada and Bryce LaMar; five from CHS: Casey Simpson, Tobin Wier, Eli Phelps, Emma McLaren, and Kyle Jackson; and seven renewals: Colt Corrigan, Henry Elsner, Jessica Oules, Jasmine Negrete, Joe Strecker, Sierra Rothlisberger, and Quinn Stamps.
Each recipient will receive $2500 for a total of $37,500 in awards this year. COVID restrictions permitting, checks will be presented to the recipients at the flagpole at Chelan Riverwalk Park at 9am, Saturday, August 8.
LCVSF Board President Betsy Kronschnabel noted “As the years go by, I continue to be impressed with how the majority of applicants are focused on their future plans and have already established clear pathways to their goals. We are really happy to be able to help them move forward.”
The LCVSF was made possible by Doug and Eva Dewar, who wished that their estates be used to help the children of the Chelan Valley. LCVSF was founded in 1991, and in that year five scholarships in the amount of $1000 each were awarded. The fund has grown substantially over the years from contributions from many people, but especially significant contributions from John Gladney, Ray Bumgardner, Don & Betty Schmitten, Marion McFadden, Virginia Husted, the Dick Slaugenhaupt Memorial and Irma Keeney. Now in its 30th year, LCVSF has awarded over $625,000 to Chelan Valley students since its inception.
LCVSF accepts applications from residents of the Chelan valley for undergraduate education. The awards are renewable for up to four years. LCVSF welcomes applications from graduating high school seniors as well as current college students and adults returning to school.
The LCVSF board includes Betsy Kronschnabel (President), Arthur Campbell, III, Linda Mayer (Secretary), Sue Clouse, Barry M. Wise, Ph.D. and John Pleyte, M.D. (Treasurer). For further information, please contact Barry Wise at bmw@eigenvector.com.
We started running “how to” webinars with our chemometrics software in the Fall of 2019. When COVID hit we decided to step up the frequency of these, and to add short courses. Up to now we’ve done ten webinars and five short courses running from 2-5 days each. Both have been quite popular. We’ve had as many as 200 people online for the webinars and 80 for some of our classes.
One of our short course attendees, having had a positive experience (over 98% of respondents to our post-class surveys say they are likely or highly likely to recommend our online courses), asked if I would share our advice on running these online events. I was happy to do so, and then realized that others might benefit from our experience as well. So here it is!
Participant’s On-screen View of a Webinar
Eigenvector Research has never had a central office. We've all worked from home for 25+ years now. We started using WebEx to talk to each other long ago, so when we had to choose a platform for webinars we stuck with it. We've had good luck with WebEx as a platform. It is maybe a little old and clunky compared to Zoom (and we wish they'd stop changing their interface), but it is definitely industrial strength and has not been plagued with security issues like Zoom. We don't do Zoom at Eigenvector.
We record our webinars and classes (more on that below) but we’ve found that people really prefer to attend the live sessions. Besides the possibility of asking questions, there’s just a better connection to the speaker and the content when you’re watching live.
For all our online events we have three key positions: the host, the moderator and the speaker(s). The host is responsible for starting the meeting, controlling who is the current presenter, and introducing the speaker and moderator.
With big groups it really doesn’t work to have people ask questions verbally; it creates too much chaos. We have participants ask questions through the chat feature and the moderator monitors the chat. The moderator then decides what questions to pass to the speaker and when would be a good time to interrupt. Usually, we have the moderator go over these “rules” when he is first introduced.
The speaker or speakers, of course, deliver the main content of the webinar. Having the moderator filter the chat questions for the speaker works well. For shorter talks questions could be held to the end, but interruptions are actually good. When you speak too long on a WebEx without feedback you begin to feel pretty disconnected, so relevant interruptions are welcome. And for the audience, it makes things a lot less monotonous to have some back and forth between the moderator and speaker.
I would advise speakers against trying to monitor the chat as they talk. It's too distracting; better to have the moderator do that. It's also really helpful if collaborators on the work being presented are online at the same time to help answer questions through the chat. There are usually several Eigenvectorians in attendance who can answer questions in the chat that really don't need to go to the speaker, such as questions specific to the person asking and not that useful for all the participants. They also often provide links to supplementary material, which can be quite useful.
Before any online event be sure to practice with the roles and make sure everyone is on board with procedures. It is especially good to practice how to pass presenter privileges and how to activate screen sharing mode. Also, test the audio! We’ve found that headsets work best. Even cheap ones are way better than using your computer microphone.
Speakers should always present their slides in “Presentation” mode so they fill the screen. It’s important that things be as large as possible when transmitted because they get shrunk into a smaller window on the participant’s end (see photo above). And definitely have speakers make up the slides in “Widescreen” format. If you use the old format you get wasted black space on either side of the presentation. If they are going to do anything on their own screen, e.g. software demos, they should change their screen resolution to something lower than native, like 1600 x 900, so everything on it is bigger. Otherwise it can hardly be seen on the participant’s end.
When demoing software, be sure to go slow. Explain what you are doing in words. Hover over buttons while dithering the cursor before clicking. Rest momentarily on fly-out menus before selecting. We’ve been pleased to find that about 75% of the participants that attempt to follow along with our software demos at home are able to do so, but we are working to make that even better!
In WebEx you should have "Mute on Entry" turned on and "Anyone can share" turned off. Even so, you'll find that people turn their microphones back on, so the host (or somebody with host privileges) needs to be ready to track these people down and mute them.
We use video sparingly because it eats up people’s bandwidth. I would suggest that the moderator and speakers turn theirs on right at the beginning just to let people see there are live people on the other end, but then turn it off once they get going.
Unless you are prepared for a big bill, make sure that when you prepare the WebEx meeting invitation you turn off the setting that gives out the toll-free number. Those calls get billed back to the WebEx account holder, and if you have a lot of international participants it adds up fast. We got a $1200 phone bill after doing our first Basic Chemometrics PLUS Metabolomics class. Ouch!
We use the features in WebEx to record the sessions. These are good quality, but they include all the windows around the speaker's screen, so we edit them and crop off the surrounding material before posting. We use iMovie, but there are many other options. It takes a while for WebEx to make the recordings available, in our experience up to 8 hours, so in the end it takes about a day to get them downloaded, edited and posted online.
We tried YouTube and Vimeo for our videos and definitely preferred Vimeo. It’s pretty easy to create “Showcases” of related videos in Vimeo, and to password protect them. (We record all our courses and make the videos available to participants in a password protected Showcase.) Vimeo also provides good analytics on viewership and it’s easy to embed the videos in web pages. For some examples, see our Webinar Series page.
We often provide supplementary videos that are posted with the recorded live events. We use Snagit to record our non-live videos and have had really good luck with it. There are many other options of course.
The chat sessions can be saved and provided along with the recorded videos, but they may have to be edited: the saved chat from the host includes all the messages that were sent privately between the host and other individuals. Still, there is sometimes valuable material in them. It is easy to forget to save them, however, and in WebEx they are very hard to retrieve after the fact!
Of course the MOST important key to a successful online event is to have content that people actually want to see. We’ll leave it to you to figure that out! Good luck with your online endeavors!
I’ve followed this story a bit since first hearing of Theranos and their claims to be able to run hundreds of blood tests simultaneously on just a few drops of blood. Based on my experience this seemed more than unlikely. At Eigenvector we’ve worked on quite a few medical device development projects. This includes projects involving atherosclerotic plaques, cervical cancer, muscle tissue oxygenation, burn wound healing, limb ischemia, non-invasive glucose monitoring, non-invasive blood alcohol estimation and numerous other projects involving blood and urine tests. So we’ve developed an appreciation of how hard it is to develop new analytical techniques on biological samples. Beyond that, we’ve also learned a lot about the error in the reference methods we were trying to match. Even under ideal conditions, with standard laboratory equipment and large sample volumes, results are far from perfect.
So when the whole thing was blown wide open by Carreyrou’s reports in the Wall Street Journal I wasn’t surprised. I read several of the follow up articles as well. But as one reviewer of the book said, “No matter how bad you think the Theranos story was, you’ll learn that the reality was actually far worse.” I’ll say. Honestly it took me a while to get into the book, in fact, I put it down for a month because it just made me so mad.
We've had a few consulting clients over the years that were, let's say, overly enthusiastic. To varying degrees some of them have been unrealistic about the robustness of their technology and have failed to address problems that could potentially impact accuracy. (I'm happy to report that none of our current consulting clients fall into this category.) In some instances things we saw as potential show stoppers were simply declared non-problems. In other cases people abused the data, cherry picking samples and building grossly overfit, non-validated models. (My favorite line was when one client's lawyer told me I didn't know how to use my own software.) We have had falling outs with some of these folks when our analysis didn't support their contentions.
But none of the people we’ve dealt with approached the level of overselling their technology to the degree that Holmes took it. As I see it there are two reasons for this. The first is that Holmes is a sociopath. Carreyrou said he would leave it to others to make that assessment but it seems obvious to me. Maybe she didn’t start out that way, but it’s clear that very early on she started believing her own bullshit. Defending that belief became all that mattered. And she teamed with Ramesh “Sunny” Balwani who was if anything worse. They ran an organization that was based on secrecy, lies and intimidation. And they made sure that nobody on their board had the scientific background to question the feasibility of what they were claiming they’d do.
But the second reason they got as far as they did was because they were exceedingly well connected. The book identifies these connections but doesn't really discuss them in terms of being enablers of the scam that ensued. It started with Elizabeth's parents' connections to people around Stanford, and Elizabeth's ChemE professor at Stanford, Channing Robertson. These led to funding and legal help. From there Holmes just played leapfrog with these connections, ending at former Secretary of State George Shultz (who I learned actually lives on the Stanford campus) and his circle, including Henry Kissinger, James Mattis and high-profile lawyer David Boies. (Boies led the Justice Department's anti-trust suit against Microsoft and was Al Gore's lawyer in the 2000 election.) Famous for his scorched-earth tactics, Boies and his firm kept the lid on things at Theranos by threatening lawsuits against potential whistleblowers and further intimidating them by hiring private eyes to surveil them. Having no firmer grasp of the science and engineering realities of what Theranos was attempting than the board members, Boies' firm did this in exchange for stock options.
It's this second point that really bothers me. There will always be people like Holmes who are willing to ignore the damage they may do to others while pursuing fame and fortune. But this behavior was enabled by well-heeled and well-connected people who failed completely in their due diligence obligations from financial, scientific and, most importantly, medical ethics perspectives. Somehow they completely forgot Carl Sagan's adage: "extraordinary claims require extraordinary evidence." It's hard to imagine that this could have ever gotten so far out of hand had Holmes attended a state university and had unconnected parents. Investors to whom you are not connected, and who are not so wealthy as to be able to afford to lose a lot of money, have a much higher standard of proof.
In our consulting capacity at Eigenvector we always try to be optimistic about what’s possible, and we do our best to help clients achieve success. But we never pull our punches with regards to the limitations of the technology we’re working with and the models we develop based on the data it produces. Theranos produced millions of inaccurate blood tests that were eventually vacated. While it doesn’t appear that anybody actually died because of these inaccurate tests, they certainly caused a lot of anxiety, lost time and expense among the customers. It’s our pledge that we will always do our due diligence, and expect those around us to do the same, so that Eigenvector will never be part of a fiasco like this.
As with most of you, we here at Eigenvector are very concerned about the COVID-19 outbreak and its impact on our families, friends, customers and communities. We continue to monitor its evolution closely, especially in regard to our employees and their families. The direct effect of COVID-19 on the daily business of Eigenvector, however, is modest. For over 25 years now everybody, or as we say, EVRIbody in our organization has worked from home. We’re all set up to work remotely as that is how we’ve always done it. We expect to be working our usual hours for the foreseeable future with perhaps a few exceptions to work with and around spouses, children and parents now occupying our homes or needing our assistance.
We recognize that many of our colleagues are experiencing work disruptions and many are now working from home. As experienced home workers we offer here a few tips for making your home office productive and keeping your work/life balance intact.
Start early and try to work regular hours.
Shower, shave and dress as if you were going to the office. (The EigenGuys aren’t very good at this, especially the shaving part.)
If you have a monitor, set it at the correct height and set your keyboard up in a way that supports your wrists.
Get a good chair and resist the urge to work in places that don’t promote good posture.
A headset with noise cancellation and a good microphone is useful for Webex meetings, conference calls, regular phone calls and simply screening out distractions.
Check in with co-workers frequently.
Plan some time for exercise even if it’s just a walk around the block or some stretching and sets of sit-ups and push-ups. I like mid-morning for this.
Try to stay out of the kitchen (this is tough) unless you’re using your break time to start long prep time meals (crock-pot, pizza dough, etc.).
Set a “closing time” and try to stick to it. Spouses/partners can enforce this by setting a beer on the homeworker’s desk at the appointed time. (This always works on me!)
Many of our software users are among these new homeworkers and we are working to accommodate them. In particular, users that work with our floating license versions of PLS_Toolbox or Solo may have trouble reaching EVRI’s Floating License Server when working from home. We have just posted a video addressing this issue: Using Floating License Software Remotely. As always, if you have any questions regarding our software or online courses please contact our helpdesk and we will respond promptly. And to our consulting clients: we are available as always.
Our 15th Annual Eigenvector University, originally scheduled for April 26-May 1, has been postponed until August 16-21. We will of course continue to monitor the situation to determine if additional delays are warranted.
Finally, we also have plans to offer additional resources online. We will step up the frequency of our "EVRI-thing You Need to Know About…" webinar series to twice per month. The next one, "EVRI-thing You Need to Know About Performing PLS-DA," is Wednesday, March 25 at 8:00 am PDT (16:00 CET). We also plan to offer a couple of our short courses via webinar in the coming months. We will let you know when plans are finalized.
I've just returned from Conference Chimiométrie 2020, the annual French-language chemometrics conference, now in its 21st edition. The conference was held in Liège and unfortunately there wasn't much time to explore the city. I can tell you they have a magnificent train station; the Gare de Liège is pictured at right.
The vibrant French chemometrics community always produces a great conference with good attendance: well over 120 at this event. The food and other aspects were as enjoyable as ever, thanks to the local organizing committee and conference chair Professor Eric Ziemons.
Age Smilde got the conference off to a good start with "Common and Distinct Components in Data Fusion." In it he described a number of different model forms, all related to ANOVA Simultaneous Components Analysis (ASCA), for determining whether the components in different blocks of data are unique or shared. What struck me most about this talk and many of the ones that followed is that there are a lot of different models out there. As Age says, "Think about the structure of your data!" The choice of model structure is critical to answering the questions posed to the data, and it is only with solid domain knowledge that appropriate modeling choices can be made.
In addition to a significant number of papers on multi-block methods, Chimiométrie included quite a few papers in the domains of metabolomics, machine learning and Bayesian methods, and a large number of papers on hyperspectral image analysis (see the Programme du Congrès). All in all, a very well rounded affair!
I was pleased to see a good number of posters that utilized our PLS_Toolbox and in some instances MIA_Toolbox software very well! The titles are given below with links to the posters. We’re always happy to help researchers achieve an end result or provide a benchmark towards the development of new methods!
A colleague wrote to me recently and asked if Eigenvector was considering rebranding itself as a Data Science company. My knee-jerk response was “isn’t that what we’ve been for the last 25 years?” But I know exactly what she meant: few people have heard of Chemometrics but everybody has heard about Data Science. She went on to say “I am spending increasing amounts of time calming over-excited people about the latest, new Machine Learning (ML) and Artificial Intelligence (AI) company that can do something slightly different and better…” I’m not surprised. I know it’s partly because Facebook and LinkedIn have determined that I have an interest in data science, but my feeds are loaded with ads for AI and ML courses and data services. I’m sure many managers subscribe to the Wall Street Journal’s “Artificial Intelligence Daily” and, like the Stampeders on Chilkoot Pass pictured below, don’t want to miss out on the promised riches.
Oh boy. Déjà vu. In the late 80s and 90s, during the first Artificial Neural Network (ANN) wave, there was a slew of companies making similar promises about the value they could extract from data, particularly historical/happenstance process data that was "free." One slogan from the time was "Turn your data into Gold." It was the new alchemy. There were successful applications, but there were many more failures. The hype eventually faded. One of the biggest lessons learned: Garbage In, Garbage Out.
I attended The MathWorks Expo in San Jose this fall. In his keynote address, “Beyond the ‘I’ in AI,” Michael Agostini stated that 80-90% of the current AI initiatives are failing. The main reason: lack of domain knowledge. He used as an example the monitoring of powdered milk plants in New Zealand. The moral of the story: you can’t just throw your data into a ML algorithm and expect to get out anything very useful. Perhaps tellingly, he showed plots from Principal Components Analysis (PCA) that helped the process engineers involved diagnose the problem, leading to a solution.
Another issue involves what sort of data is even appropriate for AI/ML applications. In the early stages of the development of new analytical methods, for instance, it is common to start with tens or hundreds of samples. It's important to learn from these samples so you can plan for additional data collection: that whole experimental design thing. In the next stage you might get to where you have hundreds to thousands of samples. In this domain the AI/ML approach is of limited usefulness. First off, it is hard to learn much about the data using these approaches. Maintaining parsimony is challenging, and model validation is paramount.
The old adage "try simple things first" is still true. Try linear models. Use your domain knowledge to select sample sets and variables, and to select data preprocessing methods that remove extraneous variance from the problem. Think about what unplanned perturbations might be affecting your data. Plan on collecting additional data to resolve modeling issues. The opposite of this approach is what we call the "throw the data over the wall" model, where the people doing the data modeling are separate from the people who own the data and the problem associated with it. Our experience is that this doesn't work very well.
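To make the "try simple things first" advice concrete, here is a minimal baseline sketch in plain MATLAB: mean-center on a calibration subset, fit ordinary least squares, and judge the model only on held-out samples. The data are synthetic and the names are purely illustrative; this is not a substitute for proper cross-validation or for PLS_Toolbox's own tools.

```matlab
% Minimal "simple things first" baseline: centered least squares evaluated
% on a held-out validation set. Synthetic data, base MATLAB only.
rng(0);
n = 60; p = 10;
X = randn(n, p);                              % predictors (samples x variables)
y = X(:,1:3)*[2; -1; 0.5] + 0.1*randn(n,1);   % response with known structure

cal = 1:40;  val = 41:60;                     % simple calibration/validation split
mx  = mean(X(cal,:));  my = mean(y(cal));     % center using calibration data only
b   = (X(cal,:) - mx) \ (y(cal) - my);        % ordinary least-squares regression vector

yhat  = (X(val,:) - mx)*b + my;               % predict the held-out samples
rmsep = sqrt(mean((y(val) - yhat).^2));
fprintf('RMSEP on held-out samples: %.3f\n', rmsep);
```

If a baseline like this already explains most of the variance of interest, any fancier ML model has to earn its extra complexity on the same held-out data.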
There are no silver bullets. In 30 years of doing this I have yet to find an application where one and only one method worked far and away better than other similar approaches. Realize that 98% of the time the problem is the data.
So is Eigenvector going to rebrand itself as a Data Science company? We certainly want people to know that we are well versed in the application of modern ML methods. We have included many of these tools in our software for decades, and we know how to work with them to obtain the best results possible. But we prefer to stay grounded in the areas where we have domain expertise, including spectroscopy, analytical chemistry, and chemical process monitoring and control. We all have backgrounds in chemical engineering, chemistry, physics, etc., plus collectively over 100 man-years of experience developing solutions that work with real data. We know a tremendous amount about what goes wrong in data modeling and what approaches can be used to fix it. That's where the gold is actually found.
Eigenvector Research, Inc. (EVRI) was founded by UW Chemical Engineering graduates Barry M. Wise and Neal B. Gallagher on January 1, 1995, and has now celebrated its 25th anniversary. EVRI is largely based on data science methods in chemistry, i.e. chemometrics, that Wise researched during his graduate work under the supervision of Professor N. Lawrence Ricker and the late Professor Bruce R. Kowalski in Chemistry. Wise developed the first version of Eigenvector's flagship PLS_Toolbox software as part of his dissertation, "Adapting Multivariate Analysis for Monitoring and Modeling Chemical Processes." The software has evolved over the last three decades into a multi-platform suite of products for building and applying multivariate and machine learning methods for pattern recognition, calibration and classification in the process environment. Its active user base numbers several thousand people worldwide, and the software is used in applications from chemical process control to teaching data science in universities.
Eigenvector provides training on the use of data modeling methods, and has organized hundreds of short courses attended by thousands of students over the last 25 years. Its annual Eigenvector University in Seattle, now in its 15th year, is a week-long event typically attended by about 50 students. EVRI also provides consulting services and has worked on a wide array of applications, from medical devices to the manufacture of high tech machine parts.
Wise and Gallagher started their graduate studies at University of Washington in Autumn 1985. They met the first day of class on September 30. An undergraduate in Chemical Engineering and Engineering Physics at University of Colorado, Gallagher completed his M.S. ChemE at UW in 1987. He went on to University of Arizona where he finished his ChemE Ph.D. in 1992. Wise obtained B.S. degrees in ChemE and Chemistry at UW in 1982 and went on to work for Pacific Northwest National Laboratories (PNNL) in Richland, WA. Returning to UW in 1985, Wise completed his M.S. ChemE in 1987 under Professor Harold Hager and his Ph.D. in 1991.
The two senior members of EVRI’s software development team also have ties to UW: R. Scott Koch graduated with his B.A. in Chemistry in 1995 and Donal O’Sullivan obtained his Ph.D. in Atmospheric Sciences in 1986. They have been with Eigenvector for 17 and 11 years now, respectively, and are a critical part of the Eigenvector team.
For more background and history about Eigenvector Research please see Eigenvector Turns 25. For questions please contact Eigenvector President Barry M. Wise.
Eigenvector Research, Inc. was founded on January 1, 1995 by myself and Neal B. Gallagher, so we're now 25 years old. On this occasion I feel that I should write something, though I'm at a bit of a loss to come up with a significantly profound message. In the paragraphs below I've written a bit of history (likely overly long).
PLS_Toolbox Floppy Disks 1994-1997
We started Eigenvector with each of us buying a Power Mac 8100 with keyboard, mouse and monitor. These were about $4k, plus another $1700 to upgrade the 8MB of RAM they came with to 32MB. Liz Callanan at The MathWorks gave us our first MATLAB licenses. Thanks! PLS_Toolbox was at version 1.4 and still being marketed under Eigenvector Technologies. Our founding principle was and still is:
Life is too short to drink bad beer, do boring work or live in a crappy place.
That's a bit tongue-in-cheek but it's basically true. We certainly started Eigenvector to keep ourselves in interesting work. For me that meant continuing with chemometrics, data analysis in chemistry. New data sets are like Christmas presents: you never know what you'll find inside. For Neal I think it meant anything that let him use math on a daily basis. Having both grown up in rural environments and being outdoor enthusiasts, location was important to us. And the bit about beer is just, well, duh!
As software developers we found it both interesting and challenging to make tools that allowed users (and ourselves!) to build successful models for calibration, classification, MSPC etc. As consultants we found a steady stream of projects which required both use of existing chemometric methods and adaptation of new ones. As we became more experienced we learned a great deal about what can make models go bad: instrument drift, differences between instruments, variable and unforeseen background interferents, etc. and often found ourselves as the sanity check to overly optimistic instrument and method developers. Determining what conclusions are supportable given the available data remains an important function for us.
Our original plan included only software and consulting projects but we soon found out that there was a market for training. (This seems obvious in retrospect.) We started teaching in-house courses when Pat Wiegand asked us to do one at Union Carbide in 1996. A string of those followed and soon we were doing workshops at conferences. And then another of our principles kicked in:
Let’s do something, even if it’s wrong.
Neal Teaching Regression at EigenU
Entrepreneurs know this one well. You can never be sure that any investment you make in time or dollars is actually going to work. You just have to try it and see. So we branched out into doing courses at open sites with the first at Illinois Institute of Technology (IIT) in 1998, thanks for the help Ali Çinar! Open courses at other sites followed. Eigenvector University debuted at the Washington Athletic Club in Seattle in 2006. We’re planning the 15th Annual EigenU for this spring. The 10th Annual EigenU Europe will be in France in October and our third Basic Chemometrics PLUS in Tokyo in February. I’ve long ago lost count of the number of courses we’ve presented but it has to be well north of 200.
Our first technical staff member, Jeremy M. Shaver, joined us in 2001 and guided our software development for over 14 years. Our collaborations with Rasmus Bro started the next year in 2002 and continue today. Initially focused on multi-way methods, Rasmus has had a major impact on our software from numerical underpinnings to usability. Our Chemometrics without Equations collaboration with Donald Dahlberg started in 2002 and has been taught at EAS for 18 consecutive years now.
So what’s next? The short answer: more of the same! It’s both a blessing and a curse that the list of additions and improvements that we’d like to make to our software is never ending. We’ll work on that while we continue to provide the outstanding level of support our users have come to expect. Our training efforts will continue with our live courses but we also plan more training via webinar and in other venues. And of course we’re still doing consulting work and look forward to new and interesting projects in 2020.
In closing, we’d like to thank all the great people that we’ve worked with these 25 years. This includes our staff members past and present, our consulting clients, academic colleagues, technology partners, short course students and especially the many thousands of users of our PLS_Toolbox software, its Solo derivatives and add-ons. We’ve had a blast and we look forward to continuing to serve our clients in the new decade!
When I was a kid I was lucky to have a great sandbox. The fact that it was on the shores of Lake Chelan made it especially nice, but there were a lot of other things that made it great. First off, it was big enough for several kids to play in at the same time. It was well equipped: there were buckets and shovels and rakes. And Tonka trucks. And later Matchbox cars. There was a handy supply of water (the lake) and plenty of "building materials" besides sand, including clay that we'd get on the beach, driftwood, sandstone bricks (left over from the construction of my parents' house) and a few random pieces of pipe. Every couple of years Dad would get out the orchard tractor, scoop up a fresh load of beach sand and put it in the sandbox. Each birthday produced new toys.
Of course the neighborhood kids (about half of whom were my cousins) were all welcome. And it didn’t matter if they wanted to build their own castles or roads or use the toy cars and trucks with the ones built by others. Everybody could play however they wanted.
Eigenvector’s software suite is a lot like that sandbox. There are a lot of tools (toys)! We’ve got everything from PCA to XGBoost. If you’re the builder type then you can use our PLS_Toolbox and MIA_Toolbox with MATLAB® to script and automate analyses and build new tools. Or you can use our existing highly refined point and click interfaces for a wide array of analyses. Have colleagues that don’t have access to MATLAB? That’s OK, they can use our Solo and Solo+MIA stand-alone software and share data and models. And they can all choose what operating system they like, Windows, MacOS or Linux.
Unlike the sandbox, where it was typically hard to take one of your creations home with you, our Solo_Predictor and Model_Exporter tools allow users to apply their models outside the sandbox in a wide variety of ways.
So everybody, or as we like to say around here, EVRIbody is welcome in the Eigenvector sandbox. And our developers will keep providing new toys and make sure that the sand stays fresh!
The MathWorks just released MATLAB® 2019b, and as always, we’re ready! The latest versions of our PLS_Toolbox, MIA_Toolbox and Model_Exporter are all set to go. So whether you are doing PLS calibrations on NIR spectra, curve resolution of pharmaceutical tablets for content uniformity, or SVM models to classify LIBS spectra you can switch to the new MATLAB and keep right on working.
PLS_Toolbox and MIA_Toolbox running in MATLAB 2019b.
As far as compatibility updates go this was a fairly easy one: the changes from MATLAB 2019a to 2019b didn’t break any of our code. We have an extensive suite of test codes that we run to verify this, plus several of us here at Eigenvector always use the MATLAB pre-release in order to identify any issues.
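As a rough illustration of the kind of check such a suite contains (a generic sketch, not our actual test code), a regression test recomputes a known result and compares it against stored reference values:

```matlab
% Illustrative regression-test pattern (not the actual EVRI test suite):
% store reference results once, then verify recomputed results still match.
rng(1);
X  = randn(30, 8);
sv = svd(X - mean(X));                  % singular values of the centered data

save('reference_sv.mat', 'sv');         % done once, under a trusted MATLAB release

recomputed = svd(X - mean(X));          % repeated under the new MATLAB release
ref = load('reference_sv.mat');
assert(max(abs(recomputed - ref.sv)) < 1e-10, ...
    'Results differ from the stored reference beyond tolerance.');
disp('Compatibility check passed.');
```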
But it's not always the case that MATLAB upgrades are easy. Sometimes MATLAB updates are significant enough that they require hundreds of man-hours on our part to make our software work properly. This is especially challenging because we try to maintain a five year window of compatibility with MATLAB versions. The current version of PLS_Toolbox (8.7.1) is compatible with MATLAB 2014a through 2019b.
What does that mean for users? It means that you can stick with your favorite version of MATLAB over the last five years (yes, we all have our favorite versions!) and still use our latest tools. And when you’re ready to switch to the newest version of MATLAB, that will work too! And of course you can also choose between operating systems as we support Windows, MacOS and Linux.
Happy Computing!
BMW
MATLAB is a registered trademark of The MathWorks, Inc.
I attended Metabolomics 2019 and was pleased to find a rapidly expanding discipline populated with very enthusiastic researchers. Applications ranged from developing plants with increased levels of nutrients to understanding cancer metabolism.
Metabolomics experiments, however, produce extremely large and complex data sets. Consequently, the ultimate success of any experiment in metabolomics hinges on the software used to analyze the data. It was not surprising to find that multivariate analysis methods were front and center in many of the presentations and posters.
At the conference I saw some nice examples using our software, but of course not as many as I would have liked. So when I got home I put together this table comparing our PLS_Toolbox and Solo software with SIMCA, R and Python for use with metabolomics data sets.
Compare Software for Metabolomics

|  | PLS_Toolbox | Solo | SIMCA | R/Python |
|---|---|---|---|---|
| Available for Windows, Mac and Linux? | Yes | Yes | Windows only | Yes |
| Comes with user support? | Yes | Yes | Yes | No |
| Point-and-click GUIs for all important analyses? | Yes | Yes | Yes | No |
| Command line available? | Yes | No | No | Yes, use is mandatory |
| Source code available for review? | Yes | Yes, same code as PLS_Toolbox | No | Yes |
| Includes PCA, PLS, O-PLS? | Yes | Yes | Yes | Add-ons available |
| Includes ASCA, MLSCA? | Yes | Yes | No | Add-ons available |
| Includes SVMs, ANNs and XGBoost? | Yes | Yes | No | Add-ons available |
| Includes PARAFAC, PARAFAC2, N-PLS? | Yes | Yes | No | Add-ons available |
| Includes curve resolution methods? | Yes | Yes | No | Add-ons available |
| Extensible? | Yes | No | Yes, through Python | Yes |
| Instrument standardization, calibration transfer tools? | Yes | Yes | No | No |
| Comes complete? | Yes | Yes | Yes | No |
| Easy to install? | Yes | Yes | Yes | No |
| Cost | Inexpensive | Moderate | Expensive | Free |
So it's easy to see that PLS_Toolbox is in the sweet spot with regards to metabolomics software. Yes, it requires MATLAB, but MATLAB has over 3 million users and is licensed by over 5000 universities worldwide. And if you don't care to use a command line, Solo includes all the tools in PLS_Toolbox and doesn't require MATLAB. Plus, Solo and PLS_Toolbox share the same model and data formats, so people in your organization who use only GUIs can work seamlessly with people who prefer access to the command line.
So the bottom line here is:
If you are just getting started with metabolomics data, PLS_Toolbox and Solo are easy to install, include all the analysis tools you'll need in easy-to-use GUIs, are transparent, and are relatively inexpensive.
If you are using SIMCA, you should try out PLS_Toolbox because it includes many methods that SIMCA doesn't have, the source code is available, it's more easily extensible, it works on all platforms, and it will save you money.
If you are using R or Python, you should consider PLS_Toolbox because it is fully supported by our staff, has all the important tools in one place, offers sophisticated GUIs, and is easy to install.
Ready to try PLS_Toolbox or Solo? Start by creating an account and you’ll have access to free fully functional demos. Questions? Write to me!
The Lake Chelan Valley Scholarship Foundation awarded 16 scholarships to college-bound students from Manson and Chelan in a ceremony at Chelan's Riverwalk Park on August 17. The awardees included four recent Chelan graduates: Sarah Brownfield, Jasmin Negrete, Owen Oules and Quinn Stamps; four recent Manson graduates: Bryan Bernardo, Megan Clausen, Santiago Santana Gonzales and Tyler Charlton; and eight renewing college students: Neil Carleton, Benjamin Charlton, Ahimelec Diaz, Malena Evig, Anabeth Morales Garcia, Henry Elsner, Addie Ivory and Jessica Oules.
This year's winners will each receive $2500, with the exception of one half-year award of $1250, for a total of $38,750.
LCVSF also administers the Dick Slaugenhaupt Outstanding Junior Award, which is selected by Chelan High School students and faculty. College-bound CHS graduate Mario Gonzales received $1000.
LCVSF 2019 Scholarship Recipients and Board Members
Regarding this year’s scholarship applications, LCVSF Chairperson Betsy Kronschnabel noted “We continue to see bright young college bound applicants in the Chelan/Manson Valley who need a financial boost to get to school. We carefully evaluate each application based on need, grades and goals, as is the mission of the LCVSF.” Board member Barry Wise added “We’re pleased to be a part of the process that helps support these ambitious young people.”
The LCVSF was made possible by Doug and Eva Dewar, who wished that their estates be used to help the children of the Chelan Valley. LCVSF was founded in 1991, and in that year five scholarships in the amount of $1000 each were awarded. The fund has grown substantially over the years from contributions from many people, but especially significant contributions from John Gladney, Ray Bumgardner, Don & Betty Schmitten, Marion McFadden, Virginia Husted, the Dick Slaugenhaupt Memorial and Irma Keeney. Since 2004, LCVSF has awarded over $575,000 to Chelan Valley students.
LCVSF accepts applications from residents of the Chelan valley for undergraduate education. The awards are renewable for up to four years. LCVSF welcomes applications from graduating high school seniors as well as current college students and adults returning to school.
The LCVSF board includes Betsy Kronschnabel (President), Arthur Campbell, III, Linda Mayer (Secretary), Sue Clouse, Barry M. Wise, Ph.D. and John Pleyte, M.D. (Treasurer). For further information, please contact Barry Wise at bmw@eigenvector.com.
Eigenvector Research President and PLS_Toolbox creator Dr. Barry M. Wise was recognized for his achievements in the field of chemometrics* at the 16th Scandinavian Symposium on Chemometrics (SSC16) in Oslo, Norway. Wise received the Herman Wold Medal in gold "For his pioneering contributions in Process Chemometrics and his extensive, deep commitment to the proliferation of Chemometrics." The award is sponsored by the Chemometrics Division of the Swedish Chemical Society and was presented by previous Wold medal winner Dr. Johan Trygg of Sartorius AG.
Dr. Wise is the first American and the first non-academic to receive the Wold medal.
Wise gladly accepted the award and thanked the award committee and his fellow Eigenvectorians past and present for their support. He also acknowledged his good fortune and expressed gratitude for the guidance provided by his graduate advisor in Chemical Engineering, Prof. N. Lawrence (Larry) Ricker, and by Prof. Bruce R. Kowalski in Chemistry.
Dr. Johan Trygg (left) presents Dr. Barry Wise with the 14th Herman Wold Medal in gold at the 16th Scandinavian Symposium on Chemometrics in Oslo, June 18, 2019.
The text presented at the award ceremony is included below.
Motivation for Barry M. Wise
Dr. Wise has a PhD in Chemical Engineering from the University of Washington, Seattle, USA and has been active in chemometrics since 1985. His research has focused largely on chemometrics in chemical process analysis, monitoring and control. For example, he demonstrated PCA and PLS for monitoring systems to detect process upsets and failed sensors, and introduced the term MSPC (multivariate statistical process control). His scientific contribution, with 99 publications (H=33) cited 3700 times, is impressive for a non-academic.
To proliferate and cultivate Chemometrics, scientists and engineers must be made aware of the benefits of the methods, have access to the methods, and be educated in their proper use. Barry has successfully achieved all three goals. Through Eigenvector Research, Inc. and PLS_Toolbox/Solo, Dr. Wise has taught thousands of students how to apply chemometrics, from basic methods like PCA and PLS to advanced non-linear methods like ANNs and SVMs. There are thousands of chemometrics users, from novices to experts, who have been impacted by the efforts of Barry Wise. These people span myriad industries, including the semiconductor, pharmaceutical, petrochemical, manufacturing, medical device, agriculture, food and beverage and automotive areas.
Barry has also received numerous awards and honorary recognitions, including the Chemometrics Award at the Eastern Analytical Symposium (EAS, 2001) and chairing the GRC on Statistics in Chemistry and Chemical Engineering (1995) and CAC-2002.
Barry has dedicated 30 years to the chemometric community and is a true role model for students and scientists in sharing knowledge and through polite and logical argumentation in scientific discussion.
In conclusion, we find that Barry Wise is a most worthy receiver of the Herman Wold gold medal of 2019.
*Chemometrics is the chemical discipline that uses mathematical, statistical, and other methods employing formal logic to design or select optimal measurement procedures and experiments, and to provide maximum relevant chemical information by analyzing chemical data.
When I was first introduced to image analysis by Paul Geladi he referred to it as Multivariate Image Analysis, or MIA. So when we released our image analysis package back in 2005 we called it MIA_Toolbox. Since then more and more analytical techniques have been adapted to produce images. On the micro scale this includes surface analysis techniques such as SIMS and Raman microscopy; on the macro scale it includes remote sensing such as infrared imaging. As the number of channels in the spectroscopic dimension has expanded, it has become more common to refer to the data as hyperspectral images. Regardless of what you call it, MIA_Toolbox was built to handle it!
Early versions of MIA_Toolbox brought the conventional chemometric tools and preprocessing methods found in PLS_Toolbox to images, focusing the analysis more on the relationship between variables (typically wavelengths or mass) than on the relationship between pixels. In this it was a departure from conventional image analysis, which focuses on the latter. For instance, Principal Components Analysis (PCA) could be applied to images, with neighboring pixels treated independently, and the results then displayed back in the image plane as image score plots, such as the one below.
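A minimal sketch of that unfold/refold pattern in plain MATLAB (synthetic data, base functions only; not the MIA_Toolbox implementation): the m x n x p image cube is reshaped so each pixel becomes a row, PCA is computed on the resulting pixels-by-channels matrix, and the scores are reshaped back into m x n score images.

```matlab
% Unfold a hyperspectral cube to pixels x channels, run PCA, and refold the
% scores into score images. Synthetic data; not the MIA_Toolbox implementation.
m = 50; n = 40; p = 20;                      % image rows, columns, spectral channels
cube = rand(m, n, p);                        % synthetic hyperspectral image

Xu = reshape(cube, m*n, p);                  % each pixel is now one "sample" (row)
Xc = Xu - mean(Xu);                          % mean-center across pixels

[~, ~, V] = svd(Xc, 'econ');                 % PCA loadings via SVD
ncomp  = 3;
scores = Xc * V(:, 1:ncomp);                 % pixel scores on the first components

scoreImgs = reshape(scores, m, n, ncomp);    % refold scores into the image plane
imagesc(scoreImgs(:,:,1)); axis image; colorbar
title('Scores on PC 1 displayed in the image plane')
```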
Many other methods can be applied in a similar fashion, including Multivariate Curve Resolution (MCR) and classification methods such as SIMCA and PLS-DA. It is also easy to do regression, provided that reference values corresponding to the image are available.
Since then we've added techniques that incorporate spatial information, such as Maximal Autocorrelation Factors (MAF), which finds factors that are more highly correlated in the image plane. An example of MAF is shown below, and it can be seen that it creates images with more contiguous areas than the PCA image shown above.
Texture is a measure of the spatial variation of intensity in an image. This property can be important to the quality of some manufactured surfaces, in crystallization processes, or in assessing the homogeneity of pharmaceutical tablets. Since 2009 MIA_Toolbox has included texture filters which can be used to create spectra from images that capture this spatial variation.
With the integration of the ImageJ image processing library into MIA_Toolbox in 2011 particle analysis was enabled. It is now possible to create a workflow where hyperspectral images of particles are treated with PCA to reduce dimensionality, then fed into particle analysis where each particle becomes a sample with measured size and shape characteristics, then those are pushed into a classification method to sort particles based on both their physical and chemical attributes. A screen shot from such a work flow is shown below.
In more recent releases we've continued to refine the usability of the tools, adding more file importers, preprocessing methods, etc. And of course all of this is also included in our stand-alone image processing package, Solo+MIA.
Want to learn more about image analysis? We're doing courses at this year's EigenU Europe in Montpellier, France, and at the SCIX conference in Palm Springs, USA. At EVRI we're excited to create tools that address the next generation of chemical analysis.
Chimiométrie 2019 was held in Montpellier, January 30 to February 1. Now in its 20th year, the conference attracted over 150 participants. The conference is mostly in French (which I have been trying to learn for many years now), with some talks in English. The Scientific and Organizing Committee Presidents were Ludovic Duponchel and J.M. Roger, respectively.
Eigenvector was proud to sponsor this event, and it was fun to have a display table and a chance to talk with some of our software users in France. As usual, I was on the lookout for talks and posters using PLS_Toolbox. I especially enjoyed the talk presented by Alice Croguennoc, Some aspects of SVM Regression: an example for spectroscopic quantitative predictions. The talk provided a nice intro to Support Vectors and good examples of what the various parameters in the method do. Alice used our implementation of SVMs, which adds our preprocessing, cross-validation and point-and-click graphics to the publicly available LIBSVM package. Ms. Croguennoc demonstrated some very nice calibrations on a non-linear spectroscopic problem.
I also found three very nice posters which utilized PLS_Toolbox:
Preliminary appreciation biodegradation of formate and fluorinated ethers by means of Raman spectroscopy coupled with chemometrics by M. Marchetti, M. Offroy, P. Bourson, C. Jobard, P. Branchu, J.F. Durmont, G. Casteran and B. Saintot.
By all accounts the conference was a great success, with many good talks and posters covering a wide range of chemometric topics, a great history of the field by Professor Steven D. Brown, and a delicious and fun Gala dinner at the fabulous Chez Parguel, shown at left. The evening included dancing, and also a song, La Place De la Conférence Chimiométrie, (sung to the tune of Patrick Bruel’s Place des Grands Hommes), written by Sylvie Roussel in celebration of the conference’s 20th year and sung with great gusto by the conferees. Also, the lecture hall on the SupAgro campus was very comfortable!
Congratulations to the conference committees for a great edition of this French tradition, with special thanks to Cécile Fontange and Sylvie Roussel of Ondalys for their organizational efforts. À l’année prochaine!
I logged in to LinkedIn this morning and found a discussion about Python with a lot of references to PLS_Toolbox in it. The thread was started by one of our long-time users, Erik Skibsted, who wrote:
“MATLAB and PLS_Toolbox has always been my preferred tools for data science, but now I have started to play a little with Python (and finalised my first on-line course on Data Camp). At Novo Nordisk we have also seen a lot of small data science initiatives last year where people are using Python and I expect that a lot more of my colleagues will start coding small and big data science projects in 2019. It is pretty impressive what you can do now with this open source software and different libraries. And I believe Python will be very important in the journey towards a general use of machine learning and AI in our company.”
This post prompted well over 20 responses. As the creator of PLS_Toolbox I thought I should jump in on the discussion!
In his response, Matej Horvat noted that Python and other open source initiatives were great "if you have the required coding skills." This is a key phrase. PLS_Toolbox doesn't require any coding skills _at all_. You can use it entirely in point-and-click mode and still get to 90% of what it has to offer. (This makes it the equivalent of using our stand-alone product Solo.) When you are working with PLS_Toolbox interfaces it looks like the first figure below.
Of course if you are a coder you can take advantage of the ability to also use it in command line mode and build it into your own scripts and functions, just like you would do with other MATLAB toolboxes. The caveat is that you can’t redistribute it without an additional license from us. (We do sell these of course, contact me if you are interested.) When you are working with Python, (or developing MATLAB scripts incorporating PLS_Toolbox functions for that matter), it looks like the second figure.
Like Python, PLS_Toolbox is "open source" in the sense that you can actually see the code. We're not hiding anything proprietary in it. You can find out exactly how it works. You can also modify it if you wish, just don't ask for help once you do that!
Unlike typical open source projects, with PLS_Toolbox you also get user support. If something doesn’t work we’re there to fix it. Our helpdesk has a great reputation for prompt responses that are actually helpful. That’s because the help comes from the people that actually developed the software.
Another reason to use PLS_Toolbox is that we have implemented a very wide array of methods and put them into the same framework so that they can be evaluated in a consistent way. For instance, we have PLS-DA, SVM-C, and now XGBoost all in the same interface that use the exact same preprocessing and are all cross-validated and validated in the same exact way so that they can be compared directly.
If you want to be able to freely distribute the models you generate with PLS_Toolbox, we have a tool for that: Model_Exporter. Model_Exporter allows users to export the majority of our models as code that you can compile into other languages, including direct export of Python code. You can then run the models anywhere you like, such as for making online predictions in a control system or on handheld spectrometers such as ThermoFisher's TruScan. Another route to online predictions is our stand-alone Solo_Predictor, which can run any PLS_Toolbox/Solo model and communicates using a number of popular protocols.
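To give a feel for why exported models are so portable, here is a generic sketch (written in MATLAB, though the same arithmetic ports to any language) of what applying a simple exported regression model amounts to: the stored preprocessing parameters and regression vector are applied to a new measurement. The numbers and names are hypothetical; the real values and code come from Model_Exporter itself.

```matlab
% Generic sketch of applying an exported regression model to one new sample:
% stored preprocessing parameters plus a regression vector, nothing else.
% All values below are made up for illustration; real ones come from the model.
x_mean = [0.52 0.48 0.55 0.61];        % stored calibration means (hypothetical)
x_std  = [0.10 0.12 0.09 0.11];        % stored calibration scaling (hypothetical)
b      = [1.8; -0.7; 0.4; 2.1];        % stored regression vector (hypothetical)
y_mean = 3.2;                          % stored response offset (hypothetical)

x_new = [0.50 0.47 0.60 0.66];         % a new measurement to be predicted
y_hat = ((x_new - x_mean) ./ x_std) * b + y_mean;
fprintf('Predicted value: %.3f\n', y_hat);
```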
PLS_Toolbox is just one piece of the complete chemometrics solutions we provide. We offer training at our renowned Eigenvector University and many other venues such as the upcoming course in Tokyo, EigenU Online, and an extensive array of help videos. And if that isn’t enough we also offer consulting services to help you develop and implement new instruments and applications.
So before you spend a lot of valuable time developing applications in Python, make sure you’re not just recreating tools that already exist at Eigenvector!
The Lake Chelan Valley Scholarship Foundation (LCVSF) will award 18 scholarships in the amount of $2500 each. The recipients include three members of Chelan’s Class of 2018: Ahimelec Diaz, Gage Martin and Madeline Peebles; six members of Manson’s Class of 2018: Alyssa La Mar, Veronica Lulo, Jessica Medina, Joe Strecker, Adeleine Torgesen and Magali Vargas; and nine renewals by previous recipients: Neil Carleton, Henry Elsner, Melena Evig, Delacy Machus, Anabeth Morales, Jessica Oules, Abbigail Phelps, Zachary Phelps and Bethany Trusel.
“The nine successful renewals show that most of our previous recipients are doing well in college, and we’re certainly glad to see that” noted LCVSF board member Barry Wise. “This year’s batch of new awards included several students with quite compelling stories of overcoming hardships to get through high school and successfully apply for college. It is always humbling to read these. We’re happy to help make it just a bit easier for these kids to continue in school.”
The awards will be presented at Riverwalk Park on Saturday, August 11 at 10am. Please join us to celebrate the achievements of these young scholars.
The LCVSF was made possible by Doug and Eva Dewar, who wished that their estates be used to help the children of the Chelan Valley. LCVSF was founded in 1991, and in that year five scholarships in the amount of $1000 each were awarded. The fund has grown substantially over the years from contributions from many people, but especially significant contributions from John Gladney, Ray Bumgardner, Don & Betty Schmitten, Marian McFadden, Virginia Husted, the Dick Slaugenhaupt Memorial and Irma Keeney. This year’s scholarships total $45,000.
LCVSF accepts applications from residents of the Chelan valley for undergraduate education. The awards are renewable for up to four years. LCVSF welcomes applications from graduating high school seniors as well as current college students and adults returning to school.
The LCVSF board includes Betsy Kronschnabel (President), Arthur Campbell, III, Linda Mayer (Secretary), Sue Clouse, Barry M. Wise, Ph.D. and John Pleyte, M.D. (Treasurer). For further information, please contact Barry Wise at bmw@eigenvector.com.
The 13th Annual Eigenvector University was held April 29-May 4 in Seattle. It was a busy, vibrant week with 40 students with a wide variety of backgrounds attending along with 10 instructors. Users of our PLS_Toolbox and Solo chemometrics packages showed some of their recent results at the Wednesday evening poster session, which has become an EigenU tradition. Now combined with our PowerUser Tips & Tricks session, it makes for a full evening of scientific and technical exchange fueled by hors d’oeuvres and adult beverages.
This year’s best poster, (as judged by the EVRI staff), was “Nondestructive Analysis of Historic Photographs” by Arthur McClelland, Elena Bulat, Melissa Banta, Erin Murphy, and Brenda Bernier. The poster described how Specular Reflection FTIR was used with Principal Components Analysis (PCA) to discriminate between coatings applied to prints in the Harvard class albums from 1853-1864.
For his efforts Arthur took home a pair of Bose Soundsport Wireless Headphones. Arthur is shown above accepting his prize from Eigenvector President Barry M. Wise and Vice-president Neal B. Gallagher. Congratulations Arthur!
The runner-up poster was "Analytical Approach to Investigate Salt Disproportionation in Tablet Matrices by Stimulated Raman Scattering Microscopy" by Benjamin Figueroa, Tai Nguyen, Yongchao Su, Wei Xu, Tim Rhodes, Matt Lamm, and Dan Fu. The poster demonstrates how the conversion of Active Pharmaceutical Ingredient (API) from its active salt form to its inactive free base form can be quantified in Raman images of tablets. Benjamin received a Bose Soundlink Bluetooth Speaker for his contribution. Kudos Benjamin!
We were also pleased to have several other very interesting poster submissions, as shown below:
Yulan Hernandez, Lesly Lagos and Betty C. Galarreta, “Selective and Efficient Mycotoxin Detection with Nanoaptasensors using SERS and Multivariate Analysis.”
Devanand Luthria and James Harnly, “Applications of Spectral Fingerprinting and Multivariate Analysis in Agricultural Sciences.”
Thanks to all EigenU 2018 poster presenters for a fun and informative evening!
Integration of Eigenvector’s multivariate analysis software with Metrohm’s Vis-NIR analyzers will give users access to advanced calibration and classification methods.
Metrohm's spectroscopy software Vision Air 2.0 supports prediction models created in EVRI's PLS_Toolbox and Solo software and offers convenient export and import functionality to enable measurement execution and sample analysis in Vision Air. Customers will benefit from data transfer between PLS_Toolbox/Solo and Vision Air and will enjoy a seamless experience when managing models and using Metrohm's NIR laboratory instruments. Metrohm has integrated Eigenvector's prediction engine, Solo_Predictor, so that users can apply any model created in PLS_Toolbox/Solo.
Data scientists, researchers and process engineers in a wide variety of industries that already use or would like to use Eigenvector software will find this solution appealing. PLS_Toolbox and Solo’s intuitive interface and advanced visualization tools make calibration, classification and validation model building a straightforward process. A wide array of model types, preprocessing methods and the ability to create more complex model forms, such as hierarchical models with conditional branches, make Eigenvector software the preferred solution for many.
"This is a win-win for users of Metrohm NIR instruments and users of Eigenvector chemometrics software," says Eigenvector President Dr. Barry M. Wise. "Thousands of users of EVRI software will be able to make models for use on Metrohm NIR instruments in their preferred environment. And users of Metrohm NIR instruments will have access to more advanced data modeling techniques."
Researchers benefit from Metrohm's Vis-NIR instruments and Vision Air software through coverage of the visible and NIR wavelength range, intuitive operation, state-of-the-art user management with strict SOPs, and global networking capabilities. Combining the solutions creates an integrated experience that saves time, improves the product development process and provides better control of product quality.
Key Advantages PLS_Toolbox/Solo:
Integration of Solo_Predictor allows users to run any model developed in PLS_Toolbox/Solo
Allows users to make calibration and classification models in PLS_Toolbox and Solo’s user-friendly modeling environment
Supports standard model types (PCA, PLS, PLS-DA, etc.) with wide array of data preprocessing methods
Advanced models (SVMs, ANNs, etc.) and hierarchical models also supported
Key Advantages Vision Air:
Intuitive workflow due to an appealing and smart software concept with specific working interfaces for routine users and lab managers
Database approach for secure data handling and easy data management
Powerful network option with global networking possibility and one-click instruments maintenance