Friday, August 30, 2013

Be glad you're not our students (this week)

This was my contribution to the "First Day of School" photo meme perpetuating on Facebook for several weeks as children went back to school. You can clearly see how happy I am for the start of another school year. It has nothing to do with how pleased I am with my clever attempt at humor. I teach general chemistry this term, and our first class was Thursday. WPI has a unique (wacky) academic calendar. It's not quarters and it's not semesters. We have 4 terms between the end of August and the end of April, which means our classes are intense experiences packed into 7 weeks. With such a short term, we have to hit the ground running and students have to work hard so they don't fall behind. Missing a week of class at WPI is equivalent to missing 2 weeks at other schools. All this makes the following situation all the more frustrating.

At some point between when Socrates was walking around the agora bugging customers with questions and 2013, a bunch of professors looked around and said: "Do you like dealing with the logistics of running an educational institution? Me neither. We should really hire some people to take care of the administrative tasks so we can focus on the teaching and research that are the reason we started doing this in the first place." Thus began the slow descent to the system we have today.

General chemistry at WPI is divided into the lecture and lab. Although the grades are combined at the end of the term, both run independently; however, students only register for the lab. We have 3 lectures running this term and each lecture has 6 labs assigned to it, but since there is no crosstalk between lab and lecture, this is an arbitrary linkage. Each lab section is limited to 24 students for safety and space reasons, so once a section is full that's it. Compound the space problem with the largest incoming class in the history of the institute. On top of the universal difficulties of limited class sizes, our enrollment procedures seem geared toward hazing incoming students and punishing professors (for what crime, it is not entirely clear, but we must have done something to someone).

This disconnected-connected lab-lecture arrangement creates registration conflicts if a student wants a particular lab section but needs to attend a different lecture because the preassigned lecture conflicts with another class. So we're starting off with a well-oiled machine already. Students register for classes online (of course) and can sign up for wait lists if the desired section is full. However, students can sign up for as many wait lists as they want. I can only see the 6 wait lists for labs linked to my lecture, so a student could be on multiple wait lists assigned to either of the other 2 lecture sections, and I won't know. This wouldn't be that bad, except the registrar locks enrollment a few days before classes begin, a lock that lasts through the first several days of classes (with our short academic term, the drop/add period is only 1 week to begin with). Students can still sign up for wait lists during this period, so if someone drops the class (e.g., tests out of that term of chemistry), no one from the wait list is automatically offered the opening until after the lock is lifted. The result: stressed-out incoming students worried about getting into a class, none of whom are familiar with the procedures, the tricks of the trade or what to expect when wait listed.

The only way a student can add a class during the lockout period is with a drop/add form signed by the instructor, which is then delivered to the registrar. Although I can view the class roster and wait lists online, I cannot select someone from the wait list and enroll them automatically, or even request online that the registrar do it. The student must physically carry a piece of paper across campus. The general chemistry instructor has a harder task than most, because great care must be taken not to break the 24-student lab section limit. So this requires sitting at a computer looking at locked class rosters (6 different ones, only one of which can be viewed at a time), unlocked wait lists and no knowledge of what's going on with the other 2 lectures and 12 lab sections. Did I mention the students were stressed out? In going through the process of emailing students to tell them I could sign a drop/add form for them, a student asked "What form?" Not a surprising question from someone who has been on campus less than a week. Showing more patience than usual, I went to the registrar's website to find the form, which was nowhere to be found. I talked to one senior colleague, who also immediately went to the registrar's website to search. Finally, another colleague informed me that the drop/add forms were only available in paper form in the registrar's office. That makes perfect sense in the 21st century. Did I mention the students were stressed out?

So, it's day 2 of the term. I have 6 unassigned seats in my lab sections. I have a dozen or so people signed up on different wait lists, but very little idea if these students are still really waiting or just artifacts left after a student signed up for another section. Did I mention the students were stressed out? At some point, the people who were hired to facilitate university business managed to outsource their job back to the professors, which is quite a coup. Yesterday, I phoned the registrar's office 3 times over the period of an hour in the middle of the afternoon. The call went to voicemail every time.

Wednesday, August 28, 2013

How much should it cost to study chemistry?

President Obama recently outlined his plan to address the high cost of a college education. Praise and criticism immediately followed, but it's undeniable the price tag for a college education has risen faster than the rate of inflation. It's actually an interesting exercise to play around with the scorecard tool at whitehouse.gov. My take home was that most prestigious/high profile universities fared well, as did most of the flagship state universities. I was taken aback at the low graduation rate (< 50%) for a lot of schools, including some that are relatively expensive. Some of my thoughts on this general problem include:
  • The increasing number of administrators with the associated cost passed along to students: can the genie be put back in the bottle?
  • While not responsible for tuition inflation (at least it shouldn't), schools are increasingly competing using amenities. New dormitories have suites outfitted with kitchens, single rooms, air conditioning, etc. Obviously these increase the quality of student life, but it's also more expensive than 4 walls, a roommate and a bed frame with an institutional twin mattress. It's just going to cost more.
  • How can public universities become more affordable in a political climate where the trend is to cut state funding for higher education? States increasingly are ignoring the ROI of subsidizing residents' education (i.e. a short-term vs. a long-term mindset), and the shortfall has to come from somewhere. The student's pocket is the only logical option given the structure of the American higher education system.
  • The scorecard graduation numbers (dropout rates) don't always account for students who transferred. This is a clear weakness in the data.
  • The major gap in the data denounced by the President was the employment record for graduates. While I agree such information would be valuable for prospective students, has anyone tried to track where students go? I have a hard time keeping up with my former group members, and both departments I worked in have tried to collect data on former majors and Ph.D. students, but it's a Herculean task. Extrapolate that to an entire school. I doubt many alumni associations would be confident that their data is accurate or complete.
  • The President's underlying message seemed to be that government support would be tied to student performance (graduation, employment, etc). While an abysmal graduation rate may point toward issues, it could also indicate schools giving students a chance to explore college (especially those who lacked the academic record to be accepted elsewhere). Many students discover that college isn't the right choice for them, at least for now, and pursue other options. Should a college be penalized for providing an opportunity? This would seem to direct public support toward students who already have a high probability of getting a degree, but do nothing to help students on the margins of higher education. This is related to a point others have made that such a metric could incentivize retention of students who are ill-equipped (skills or motivation) for college success who might otherwise choose different careers.
The above thoughts, as well as the commentary from others on the same issue, led me to do some superficial evaluation of how I contribute to the cost of a chemistry degree. Chemistry and other laboratory-intensive majors can stick students with hidden fees, but as far as I can ascertain my current employer doesn't, which is fair since the tuition currently sits at a lofty $42,000 per year. Conversely, I'm pretty sure UConn did charge fees for laboratories, but the tuition was less than half that of WPI. So far so good.

So what is required for my classes? For labs, a notebook and some safety goggles. Both items are affordable, and notebooks may have been used in previous labs and should still have space for future use. For a lecture class, students are "required" to buy a textbook. I have been guilty of being unsympathetic to students complaining about the cost of textbooks in the past. Textbooks were expensive when I went to college, and we complained too. Why should today's students miss out on experiencing the collective suffering that binds classmates together? Buying textbooks was worth hours of round-table moaning and groaning before the stress of the semester began. It was a red badge of honor to have the most expensive required textbook.

The only problem is that the inflation of textbook prices has outpaced the inflation of just about everything (see graph in the linked article). I've become aware of the magnitude of the problem recently. I teach one of several general chemistry sections, and the textbook we use recently came out with a new edition. The publisher essentially forced us to adopt the new edition since they can't (won't) provide the bookstore with the previous one. The publisher doesn't profit from the secondary market for used textbooks, and as a business they are obviously motivated to sell as many new textbooks as possible. A side-by-side comparison of the two editions shows very few differences, though. The problems at the end of the chapter have been changed and the cover is a different color. The latter is a minor aesthetic change, but the former makes the earlier edition unusable if an instructor plans to assign practice problems. Clearly there are workarounds for these changes, but the inconvenience factor favors the publisher in the long run.

From a content standpoint, any general chemistry instructor could probably assign reading and teach from a textbook printed 30+ years ago. What differences would there be? Today's periodic table includes more transuranium elements, and some of the sidebars that bring in recent examples of chemistry applications would be out of date in an old textbook, but the basic concepts haven't changed. There are very few new textbooks or revisions that propose seismic changes to the accepted general chemistry teaching dogma. So students are paying hundreds of dollars to get content available in a used bookstore or from the discarded bookstacks of a retiring professor. Certainly publishers should be able to make a profit, but why create a system that depends on exploiting students with inflated prices?

This year, our general chemistry instructors adopted an online homework system. Students are charged for this service as well, which can add up to another $100* depending on how many terms of the sequence they are taking. In the absence of a required textbook, online homework provides a solution to being held hostage by publishers, but if the textbook remains a requirement, it adds another straw to the camel's back of college costs. 

*I need to verify this

Friday, August 23, 2013

VAPs: Ladder or Dead End?

I began thinking more about visiting assistant professor positions when one of my postdocs unexpectedly left this week to take such a job.

A few years ago, the NSF formalized the mentoring of postdoctoral researchers by requiring a mentoring statement to accompany all grant proposals requesting support for postdoctoral researchers. Even without such a requirement, helping our students and postdocs reach their career goals is perhaps the most important job of a PI. In mentoring postdocs as well as students, I've routinely discouraged applying for or accepting temporary teaching positions. These positions have many titles: visiting assistant professor, assistant professor in residence, adjunct, etc., but all share the common feature of short-term (and possibly terminal) contracts. Certainly, these positions fill a desperate need for colleges with instructor shortages, but what do they do to the career trajectory of the postdoctoral researcher? Unfortunately, the evidence appears to be largely anecdotal. Both Chemjobber and Andre the Chemist have written on this, and the Chronicle of Higher Education recently recounted the famous cautionary tale of Stan Nadel. I haven't found much hard data or attempts to distinguish between the sciences and humanities. There is also the related issue of whether colleges are using these positions as a cheap substitute for hiring tenure track professors or full-time instructors, but that is a topic for another day.

All three postdocs who have entered the workforce from my lab have taken positions as VAPs, although not via the same pathway. One followed the typical pathway of the postdoctoral researcher, working hard and producing results for 12-18 months before applying for industrial jobs. This chemist is immensely talented, one of the best I've worked with. While it oversimplifies the situation somewhat, the perfect storm of the economic crisis and concomitant downsizing by the pharmaceutical industry led to a drawn-out job search that ultimately ended in frustration. For personal reasons, he finally relented and took a VAP position. Fortunately, his excellence as an instructor was eventually recognized with a change in title to a more stable position, albeit not on the tenure track. The latter 2 postdocs aspire to careers in academia: one came directly from the lab, the other after spending time in industry following a previous unsuccessful tenure track job search. It remains to be seen what will happen when they come to the end of their temporary contracts.

My warnings, founded only on anecdotal information, to postdocs are as follows:

1. If you aspire to work in industry, a temporary teaching position does more harm than good. Even if you are offered the opportunity to participate in research while teaching, the workload of a VAP will be prohibitive. Justified or not (and likely not), a gap in your research record will be viewed as a lack of either commitment or talent. Someone coming directly out of a research-intensive position will be deemed to have more current skills and knowledge of the research world. It will also be assumed that you were unsuccessful in previous job searches, and thus were already judged to be inadequate. There may be no defensible justification for this assessment, but don't expect prospective employers to factor in extenuating circumstances.

2. A temporary position is not a high-percentage route to the tenure track, whether at a research school or an undergraduate institution. The same argument from above applies: candidates currently involved in research are more current in their knowledge and have not already been passed over. While teaching is an important skill for tenure track professors, all candidates have experience as TAs, and search committees assume that on-the-job training is sufficient to become an effective instructor. Experience teaching as a VAP will not be weighed heavily when comparing candidates.

3. At best, a VAP or similar position could lead to a semi-permanent teaching job (i.e. longer-term contract) if you have an excellent record in the classroom. If your career ambition is to be a lecturer or teach at a community college, then this is a reasonable plan; however, I know very few people who aspire to this because the salaries are usually lower and the workload is high.

In a tight job market, it is hard to fault postdocs for taking VAPs, since a job is better than unemployment. If an individual aspires to a tenure track or industrial position, however, I recommend remaining a postdoctoral researcher as long as it is feasible/tolerable. Even a second postdoc may be a better stopgap measure than a VAP. Even though there appear to be problems aplenty with VAPs, other job seekers may find them less problematic. Retirees interested in a second career are likely less sensitive to the long-term job prospects, uncertainty and compensation issues associated with VAPs. I know several people in this category, and all bring a unique perspective to teaching that has added value for their students. Laid-off workers from industry, however, may face the same problems postdocs do in breaking out of academia once they have taken VAPs. This is a catch-22 in reconciling present realities with future opportunities.

While this is my advice, I would be much more comfortable dispensing it if I could cite solid employment data.

Thursday, August 22, 2013

Questioning the Sierra Magazine "Coolest School" Rankings

Everyone loves a list. Put something in the form of a top 10, 50 or 100 and it is guaranteed to drive traffic to an article. This was certainly true of Sierra Magazine's "Coolest School" list, which was picked up by numerous outlets including list & slideshow reposter extraordinaire the Huffington Post. While I certainly have no love lost for my former employer, this is a criticism of the methodology used to generate the list, rather than of the institution that happened to be ranked #1. The Sierra article cites a number of laudable initiatives, but UConn doesn't even crack the top 22 honor roll on the same topic by the Princeton Review. So clearly someone's research methodology is flawed. Here is Sierra Magazine's methodology. The names on the document that was self-reported to the magazine appear to be those of current students involved in Eco-Husky.

If what I've observed at UConn is tops in the nation, everyone is doing a pretty lousy job making colleges greener. For example, UConn has expanded with a blatant disregard for water resources: the student body has grown in recent years, and the downtown development puts further strain on the water supply. Until a pipeline can be installed to transport in water from off-site, one can anticipate continuing to drain the Fenton River beyond its breaking point, especially during dry periods. There are questions about whether or not the current plan is adequate anyway.

When I was there, I frequently bemoaned the use of recycling bins as trash cans and vice versa, although I have yet to be on a campus where this wasn't a problem. I have great enthusiasm for UConn's Think Outside the Bottle campaign, as the negative impacts of bottled water are well-documented; however, the university catering service still lists "UConn Natural Spring Water" as a product, so I assume the university is still selling branded bottled water on campus. Despite being a large campus spread out over 3000 acres, I was always struck by the lack of bicycles as a mode of transportation, even with several off-campus housing facilities within a couple miles. The Sierra Magazine article mentions bicycle shares, but unless they've drastically increased the number of racks on campus (and if they did, they've hidden them really well), I can't blame students for being reluctant to commute to class by bicycle. The Storrs Center development has increased traffic, and will increase it further, with its apartments and businesses, yet no one saw fit to include bicycle lanes in the plan (bicycle racks are also limited). This does nothing to make the area greener or safer. The development's Board of Directors includes members from both the town and the university.

Personal observations seem to contradict the Sierra Magazine reports to some degree. Certainly no campus is, or can be expected to be, 100% green yet. Decreasing the environmental impact of campuses on the surrounding areas is important for sustainability. The question is, do Sierra Magazine's rankings reflect the green reality of those campuses? How accurate are any of these lists that perpetuate via social media?

Sunday, August 18, 2013

EA discussion prompted by Dorta Organometallics paper

Originally, I responded to ChemBark's post on the Dorta paper on Facebook. CEN's Carmen Drahl subsequently requested permission to use parts of this comment in a news article on the situation. To clarify my position, I provided Carmen with a (long-winded) comment on elemental analysis. Here is, more or less, that commentary, edited to be bloggable.


When was the last time you looked at a paper and went to the experimental or SI desperate to see the elemental analysis? When was the last time you bothered to look at it, period? I suspect the answer is seldom or never, unless you were trained during a different era of chemistry. EA was the gold standard up until 30-35 years ago, but is it today? I argue no. Many journals still list EA as a requirement (or option), but enforcement of that standard is really inconsistent. I suspect that EA was deemed a more reliable metric of purity than NMR in the past because impurities <5% wouldn't show. I'm not so sure that is as true with modern spectrometers. While it has its own weaknesses, analytical HPLC is a much better technique for organic compounds. Coordination compounds/organometallics may not be as amenable to HPLC, but techniques such as X-ray crystallography, NMR, powder diffraction and melting point, individually or in combination, can give an adequate or even better indication of purity.

I do believe EA has value if someone is reporting an unexpected or unusual result. An example that immediately comes to mind is a putative HBr complex of a porphyrin ligand reported by Lippard in 1998. In my recent experience, we had a chelating ligand with 5 donor groups. We got exactly what one would expect in the X-ray. The EA was off by a large margin. It was nonsense. It may not have been 100% pure, but it certainly wasn't half garbage. I suspected incomplete combustion, because the company did not routinely use a combustion aid, which we always used for perchlorate/metal complexes when I was a student. Fortunately, the HRMS detected the molecular ion, and that was deemed a sufficient form of additional characterization. There was nothing unusual or unexpected about this complex, but we needed to meet the journal requirements.

Back to the original point of "who looks at EA?" (and perhaps why it was deemed "acceptable" to make up numbers). I think everyone looks at EA as a major hassle because they have had issues getting data. Of the 5 universities I've been at (CWRU, MIT, Berkeley, UConn and WPI), only Berkeley had on-site EA, which may indirectly indicate how valuable the community finds EA. Getting an analysis therefore requires sending out a sample, which becomes even more non-trivial with sensitive compounds. Even if the compounds aren't expected to be sensitive, an off analysis always makes one suspicious of decomposition in transit, or a problem with the contracted lab. Then there's sending multiple samples trying to get a good analysis; it can become very tedious. Furthermore, I suspect a large percentage of people who report an EA have used the trick of including solvent in the calculation to make the analysis hit. I had to do this with my first sample at MIT. We must have analyzed dozens of crystals from many batches of one complex by X-ray (it made beautiful uniform, individual xtals). It was always the same thing. We thoroughly dried the complex, but the analysis wouldn't hit after many attempts. That's when we modeled in the water (fortunately, water was in the crystal lattice, so this was a reasonable assumption). I suspect others have done the same, and perhaps even added in solvents that weren't seen in the X-ray structure (but were in the synthesis) to make their data fit. If you're essentially massaging the data because it's hard to get what you expect, or you're jumping through so many hoops to get a hit (with what you are 99% sure is a pure complex/compound), it doesn't impart much faith in the technique.
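The solvent-of-crystallization trick is just arithmetic: modeling in a water raises the molecular weight and shifts every calculated percentage, sometimes just enough to bring a stubborn analysis within the commonly cited ±0.4% criterion. A minimal sketch of the calculation (the copper complex formula below is entirely hypothetical, chosen only to illustrate the effect):

```python
# Illustrative only: how a modeled water of crystallization shifts
# calculated elemental analysis percentages. Formula is hypothetical.

MASSES = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999, "Cu": 63.546}

def percent_composition(formula):
    """Return ({element: mass %}, molecular weight) for {element: count}."""
    mw = sum(MASSES[el] * n for el, n in formula.items())
    return {el: 100 * MASSES[el] * n / mw for el, n in formula.items()}, mw

# A made-up complex, CuC10H14N2O4, calculated dry...
dry = {"Cu": 1, "C": 10, "H": 14, "N": 2, "O": 4}
# ...and the same complex with one lattice water (adds 2 H and 1 O).
hydrate = {"Cu": 1, "C": 10, "H": 16, "N": 2, "O": 5}

for label, formula in [("anhydrous", dry), ("monohydrate", hydrate)]:
    pct, mw = percent_composition(formula)
    print(f"{label} (MW {mw:.2f}): "
          + ", ".join(f"{el} {pct[el]:.2f}%" for el in ("C", "H", "N")))
```

Each added water drops the calculated %C by a percent or two, which is exactly why a "dried" sample that analyzes low can suddenly hit once a fractional solvent is included in the formula.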

In the end, I prefer alternatives to EA whenever possible. If I were to project a motive onto Dorta, I doubt he's really trying to pull one over on the journal/community with bad data. He's looking at the inconvenience of getting the data, and perhaps has little or no faith in the technique but does have confidence that the complexes/chemistry are correct (i.e., he does not believe it warrants the hassle of adding more confirmation). It's not the right way to do science, but it's somewhat understandable. I think there is an opportunity here to talk about reasonable standards, the review process, and scientific integrity. You really can't prevent falsifying purity data; almost every technique has a workaround that would not be detectable without close scrutiny or even reproducing the author's work.

Wednesday, August 14, 2013

Making science personal (when appropriate)

In a recent Chemistry World blogpost, Philip Ball suggests that scientists should be more proactive in inserting themselves into scientific papers. Ball claims that using a first-person narrative style provides greater clarity in writing as well as an increased individual investment in the science. Enthusiasm for the proposal was evident immediately on social media, but the post struck a nerve in me about the quality of scientific writing generally. There are several caveats to this practice that are not explored in sufficient detail. There is also some ambiguity in the commentary, as voice is mixed with narrative style. One does not need to use pronouns to write in an active voice, and passive-voice writing can use personal pronouns. Regardless of the first-person narrative issue, an active voice is usually preferable to a passive one in scientific writing.

Having reviewed numerous manuscripts and collaborated on writing both papers and grants, I find that first-person narrative is often used as a stylistic crutch, and does very little to achieve the goals outlined by Ball. Training students in scientific writing is an important part of a PI’s job, although this easily can be forgotten in the publish-or-perish environment of modern academic science. The majority of Ph.D. students will not pursue careers in academia, and they must develop writing skills that reflect professional practices. Journal publications will likely represent the minority of a typical student’s scientific writing, so we need to consider how to use this writing experience as a tool to prepare them for the future.
One of my mantras to students, which relates to first-person style, is economy of words. I commonly encounter practices like:

   “It has been shown that palladium is an effective catalyst for C-C bond formation1”
    or
   “Smith has shown that palladium is an effective catalyst for C-C bond formation1”

The qualifier “it has been shown” is verbose and should be removed. “Palladium is an effective catalyst for C-C bond formation1” is the useful information. The acknowledgement of an individual’s or group’s contribution to science may have merit, but like the practice of first-person narrative, the impact becomes diluted with overuse. The name inclusion also duplicates information contained in the citation. When reading a review that makes liberal use of names, eventually you will mentally block out the names completely to focus on the scientific content.

Contrast this style to using names for emphasis. Using a name can help denote a particularly impactful discovery or indicate significant contributions to a field by a particular group/individual. In my papers, Roger Tsien and Graham Ellis-Davies are names that appear with some regularity because each can claim to be a pioneer in photocaged complexes, and both have published numerous seminal papers in the field. I am less likely to mention other practitioners by name, nor would I expect them to do so when citing my work.
The use of “we” or “I” can be effective when used appropriately. So when should the first-person narrative be used, and when is it unnecessary (or worse)?

Useful: putting your efforts into context of the field as a whole or the long-term goals of your research program – emphasize where you fit in the picture and what you will contribute.

Borderline: stating “we are interested in…” – close to the previous point, but too specific. You are publishing a paper on the subject, so it’s safe to assume that you’re interested in the topic.

Not useful: any form of “we did this experiment…” – your name is on the paper describing the science; anything restating this is redundant.

Not useful: using first person to describe a routine observation or result. Again, this is redundant with the author list.

Borderline: using first person to describe an unusual/unexpected result – the crux of Ball’s assertion is the need for a writing style that hedges against individual bias and fallibility. So one must ask the question: to what degree does the data represent a universal truth?

Most useful: providing your explanation and implication of results – this relates to Ball’s fallibility argument too. It emphasizes that this is your best analysis based on the available data, but also leaves room for what is to come. Future research could reinforce or contradict the current interpretation.

This last use seems to be the most important. In this situation “we” has real meaning. It conveys something about the relationship between the scientist and the science.