With movie theaters closed and no box office revenue, 2020 was obviously a very weird year for film. The Oscar nominees are almost always “prestige films” that aren’t really meant to be crowd pleasers, but this year’s crop takes that to a whole ‘nother level: the studios simply held back on releasing anything that audiences were supposed to actually enjoy. The one exception I can think of was Christopher Nolan’s Tenet. I don’t follow industry news, but I’ve got to imagine that part of the push was that they suspected the film could be in line for more than the Production Design and Visual Effects nominations that it got. Unfortunately, Tenet indulged Nolan’s penchant for complex technical filmmaking and movie-as-puzzle-box without offering either characters with any depth or a plot that audiences could enjoy instead of trying to solve like a math equation. The point is that most years at least a few big-budget crowd-pleasers make it onto the list (Once Upon a Time in Hollywood; Black Panther; Dunkirk; La La Land; Mad Max; The Grand Budapest Hotel; The Wolf of Wall Street; Django Unchained; Moneyball; Inception…). This year there was only one such swing, and it was a miss.
Unlike last year, I thought all of this year’s best picture nominees were good films. That said, I found ranking them more difficult than last year; I didn’t really love any of them, and there were a few that I respect and can see other people enjoying, but that just didn’t speak to me. (My classic examples of such films are Blade Runner, which I just don’t enjoy, and 2001, which I adore, but which I can see boring reasonable people with good taste.)
I didn’t think there were any truly great films among this year’s crop of nominees. Here’s what I thought of them, from worst to best:
Throughout December, I’ve been taking part in the 2018 Advent of Code, a casual contest which releases two new puzzles every day at midnight, each of which can be solved by writing a little computer program. I coded all my solution generators in Python, and have made a repository of my code publicly available. I haven’t gone back to clean any of this up after getting the right answer, and (as I detail below) the race against the clock very much changed the way I program. What you see in my repo is definitely not great engineering practice, and in many cases is pretty ugly Python, because that’s what came out of my fingers first (or was expedient in the course of debugging).
Annual Oscars post:
When Apple made a phone, it turned out it wasn’t really competing in the handset business; it was competing for the next dominant personal computing platform. The more I think about an Apple car, the more I think that it might be the basis of their future “computing environment”: a space that is completely aware of and responsive to its occupant(s). In that sense it might be more of a long-term competitor to the Amazon Echo (and whatever Android variant Google is pitching at the same space) than to Tesla’s cars.
The old “What if they hired carpenters the way they hire programmers?” joke/commentary didn’t sit right with me the first time I read it, and after stumbling across it again I now see why. Among other things:
The costumes may change, but my 2011 commentary remains remarkably relevant. No need for a full play-by-play; we can skip straight to the awards.
There’s a specific form of logical fallacy or cognitive bias that I’ve never seen explicitly listed in collections of such fallacies or biases. It is related to the “Fallacy of False Cause” and to the “Illusion of Control” bias. I call it the fallacy of causation, or the fallacy of the single cause.
MG Siegler recently opined that the reason Android is having some success against the iPhone but little against the iPad is support from mobile carriers. John Gruber linked to Siegler’s piece, adding:
Is there any better demonstration of scientific culture, and the ways it differs from other fields, than the faster-than-light neutrino flap? Consider:
A few days ago, the Wall Street Journal published an op-ed written by a collection of scientists claiming distortion of science in support of climate alarmism. I don’t necessarily agree with everything they wrote, but their central point seemed quite sensible: whether or not “drastic actions on global warming are needed” is not something on which all scientists agree. I’d go farther and say that it’s clearly not even a scientific question; it involves a great deal of politics (i.e. how do we balance different values as a society) and economics (how will various types of “drastic actions” affect our ability to address these different values). But the main thrust of the letter is that a climate change orthodoxy is being imposed upon the scientific community to support this political stance. The letter provides a few examples.
My predictions for 2011 were my worst yet. Like last year, I don’t have many unique insights into the next twelve months. Unlike last year, however, I’m not going to try quite so hard to pretend I do. Here are six modest predictions for 2012:
This site is clearly in need of some serious updating (in terms of both content and layout—I only recently discovered the atrocious rendering under IE). Hopefully I’ll manage to get around to that at some point soon. For now, however, I’m just back for my oh-so-brief annual self-shaming: I’m reviewing the results of my predictions for 2011.
Let the objectification begin! (/continue!)
Three questions for David Cameron:
There’s been plenty of digital ink spilled over the patent system in the wake of Google’s shameless hypocrisy on the matter. The conversation seems to move pretty fluidly between discussion of patents in general and discussion of software patents. My understanding is that the portfolio Google is complaining about involved more than just software patents, but I may be wrong; the point is that arguments about software patents are at least partially orthogonal to the Google situation.
I moved into a small unfurnished apartment in Brooklyn a few months ago, so I had a relatively blank slate to start from in setting up a work area. I’m in the company office most days, but try to work from home one day a week, in addition to the hours I spend working at home on nights and weekends.
Most of the coverage of the current debt limit debates that I’ve seen has suffered from what I consider the single greatest problem with mainstream news reporting. It focuses so much on the here and now that larger story lines are overlooked.
David Pogue doesn’t think much of the Samsung Chromebook, but does like the attempt:
Another year is dead and gone, so I suppose it’s time to review my predictions for 2010:
I’d like to congratulate Oxford’s environmentalists for another outstanding effort at Christmastime carbon reduction. The between-term travel of Oxford’s huge student body causes an absolute explosion of emissions—a single return flight from Oxford to the US, for example, represents roughly 20% of an average person’s annual emissions—and so the focus that this period receives is well-deserved.
(Preface: I use the term “social issues” here to refer generally to issues of concern to society, including everything from civil rights to economics to natural disasters. Such topics are sometimes categorized as either current affairs or political issues; I feel that both those terms carry baggage—reaction-driven in-the-moment decisions and electoral tactics, among other things—that is best introduced independently from the underlying issues.)
The news has been dominated this week by “cablegate”. In short, 250,000 classified reports from US diplomats to the US State Department were leaked to a group called WikiLeaks, and WikiLeaks is publicizing the entire set. Although the policy revelations contained in the reports released so far have merely helped to confirm long-assumed truths of international diplomacy, the extremely candid assessments of foreign officials given by diplomats are quite embarrassing to all concerned. Many politicians and government officials in the US consider the release espionage (or even terrorism) and are demanding legal action.
It’s hardly a new phenomenon, but the public “debate” over health care reform in the US focused primarily on opposition that took the following form:
There’s been an amusing back-and-forth over a relatively recent book, Disrobing the Aboriginal Industry: The Deception Behind Indigenous Cultural Preservation, which questions whether “cultural preservation” is doing more harm than good for aboriginal populations.
New York governor David Paterson has been trying to get the Park51 project to change their plans to build a mosque in lower Manhattan. Like a true politician, his reasons involve no principle other than the avoidance of an unpalatable political debate. He is reportedly “trying to bring people together on the issue”.
An email that just went round to Oxford’s entire computing laboratory:
As has been linked on various tech sites, a possible proof of one of the most important outstanding problems in theoretical computer science is currently under peer review. In light of this, it has quickly become the fashion for everyone to pretend that this has significance for the larger tech community.
There is a scale of commitment for any competition. At the low end, it’s not terribly relevant who wins a friendly contest—sometimes the participants barely notice who wins a game like charades or pictionary, in which scoring is mostly an afterthought. In many other casual games people do attach some importance to winning, but it’s usually kept in some perspective: you want to beat your buddies at poker, but it’s not worth anyone losing their house and life savings over.
Some media and the usual cohort of environmentalists have once again decided to disengage their brains and embrace some bullshit to bolster their narrative, as is their wont. The latest is a story that solar power is cheaper than nuclear, based on a report that some stories are calling “a new study by two researchers at Duke University”. Despite that sciencey description, this is not a peer-reviewed paper and it’s not from independent researchers. It’s a position paper published by NCWarn, an environmentalist organization whose primary goal is the elimination of nuclear power.
SlashFilm recently conducted an interview with Armond White, the notoriously contrarian critic who panned Toy Story 3 (destructive consumerist themes) but loved Transformers 2. Reaction from David Chen of SlashFilm is here.
The iPad version of iBooks got support for PDFs in a recent update. Overall support is quite good; however, the iPad screen is roughly half the size of an A4 or 8.5-by-11 page, so text can be very small. In many cases work formatted for the printed page includes huge margins, whether to accommodate binding or reformatting for different paper sizes, or just to facilitate holding a page without obscuring text. None of this is necessary on the iPad, so it’s often very useful to crop PDF pages in order to devote more of the screen to content.
For small competitions designed primarily to choose a single winner (e.g. elimination tournaments) it makes sense to recognize only the top few places; traditionally the top three places are awarded gold, silver, and bronze medals. In larger competitions in which a large number of teams are reliably ranked, however (e.g. contests where each individual strives to achieve the highest score, independent from other competitors), it can make sense to recognize more than the top three: “top ten” lists are not uncommon. Unfortunately, there is no agreed standard for medals beyond the top three spots. For the good of humanity, I now proclaim medal types for the top ten finishers in any contest:
There are four different networks providing plans for the iPad 3G in the UK, and all of them offer either pay-as-you-go or rolling contracts. What’s more, they all offer SIM packs for free, and at least some of them give you some initial credit with your free SIM, so it’s worth giving them all a try.
Well, for teachers, at least.
Concern over global warming has led to a steady stream of advice on how you can cut your “carbon footprint”. Eating vegetables instead of an average serving of chicken or pork, for example, is claimed to save on the order of half a kilogram of carbon dioxide equivalent emissions. The environmentalists I have come into contact with in Oxford go so far as to recommend reducing energy consumption by taking cold showers: heating the water for a 25-gallon shower with an inefficient heater can require up to five kilowatt-hours, for 2.5 to 5 kg of CO2-equivalent. I’ve seen lengthy lists of recommendations, most with no numbers for actual savings attached, and most including items that either save nothing at all or actually increase consumption—“drive instead of fly!” is a particularly egregious example of confusing inconvenience with energy savings.
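For what it’s worth, the shower number roughly checks out. Here’s a back-of-the-envelope version with my own assumed inputs (25 US gallons heated about 35 °C above the supply temperature; these figures are mine, not from the lists I’m complaining about):

```python
# Rough check on the shower figure. Assumptions: 25 US gallons of water
# (about 3.785 L per gallon), heated ~35 degrees C above supply temperature.
litres = 25 * 3.785
joules = litres * 4186 * 35   # specific heat of water: ~4186 J per kg per degree C
kwh = joules / 3.6e6          # 1 kWh = 3.6 MJ
print(round(kwh, 1))          # about 3.9 kWh before heater losses
```

Add heater inefficiency and “up to five kilowatt-hours” is entirely plausible.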
Robbie Bach, retiring president of Microsoft’s Entertainment & Devices Division, on Windows Mobile’s continued loss of market share to the iPhone and Android:
I went to a talk last week by Stephan Lewandowsky entitled “Climate Change: Consensus or Dogma, Hoax or Religion?” I can only assume that someone other than Lewandowsky wrote the title, because his view was fairly simple. Not dogma; not hoax; not religion. Just consensus.
I think “democracy” is the worst-understood concept in politics. Or, rather, while there may be more confused notions—capitalism comes to mind—people usually have some awareness of the issues’ complexities and are at least a bit wary of invoking their names as absolutes. Statements like “mandatory health care is anti-capitalist” provoke some minimum amount of reflection on the meanings of the words used; reaction to statements like “judicial review is undemocratic” tends to skip right past such parsing and on to consequentialist and historical argument.
It’s not given to human beings to have such talent that they can just know everything about everything all the time. But it is given to human beings who work hard at it—who look and sift the world for a mispriced bet—that they can occasionally find one.
And the wise ones bet heavily when the world offers them that opportunity. They bet big when they have the odds. And the rest of the time, they don’t. It’s just that simple.
Obviously the software economy is very different from that of other businesses. The most widely-acknowledged difference is in the split between overhead and marginal costs: while prices for many physical goods have historically been dictated by the costs of production, producing one extra copy of a software product is effectively free.
The health-care industry and various research communities (among others) make heavy use of “ethics panels” these days. Such panels are usually mandated to take a broad view of how specific actions will impact welfare: to what extent is it permissible to mislead someone in the course of a research experiment? would the knowledge gained from a particular experiment justify killing a dozen mice? when are patients competent to make decisions about refusing treatment?
In order to try to stifle criticism of their faith, some Muslims have threatened and sued and murdered. Leaders from Islamic countries (who can hardly be called fringe figures) even managed to pass a UN resolution banning any criticism of religion (Islam in particular), noting:
It’s widely accepted that in India, China, and other countries the best and brightest students choose courses and careers in engineering and the mathematical sciences, while in the US a much greater fraction of elite students instead choose subjects in the humanities, such as history and literature, and (largely) non-mathematical sciences, such as anthropology and psychology.
Proponents of faith as a virtue frequently argue that even science is based on blind faith—faith in causality, or faith that the universe obeys laws accessible to human intelligence, or faith in some specific underlying principles. In his book An Enquiry Concerning Human Understanding, 18th-century philosopher David Hume argues that science is based on “the principle of the uniformity of nature”—that patterns observed in the past will continue to be observed in the future—and that this principle cannot itself be logically derived. A recent post by a friend of mine summarizes some of Hume’s discussion.
George Will, columnist for the Washington Post, is an intelligent man. His latest column raises an interesting point of constitutional law. It also demonstrates perhaps the fundamental failure of newspapers in furthering real debate on political topics: a tendency to focus on the minutiae of issues instead of providing perspective.
I hate Daylight Saving Time. It’s a huge pain to administer (I lose track of which clocks change themselves automatically and which need to be reset by hand), it’s handled inconsistently from region to region in the world (making managing international scheduling and communication more difficult), and it disrupts everyone’s sleep patterns.
Jason Kottke makes an interesting observation about recent reviews on Amazon: people are giving the minimum possible ratings to books and movies that they actually quite like. In the case of the Lord of the Rings films on Blu-ray, reviewers are complaining that only the theatrical versions are available, not the extended cut versions die-hard fans want to see. For the recently-released book The Big Short, Kindle owners are up in arms that only the hardcover has been released, not the e-book.
There’s a piece from James Surowiecki in the New Yorker about the collapse of the “middle” of markets: companies do well shooting for high-end, high-margin offerings, or for low-end budget products, but not the stuff in the middle.
Jeremy Beer offers a thought-provoking essay on how meritocracy is killing Middle America:
If it weren’t published by InfoWorld, I’d think that this was a brilliant bit of satire:
I’ve had a hard time finding a concise explanation of the rankings of poker hands that’s suitable for use as a reference by new players. In particular, the rules for comparing between hands of the same type (e.g. if two players have flushes, which one wins) are generally described in a very ad-hoc manner.
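To make that concrete, the tie-breaking rule for most hand types is less ad-hoc than the usual descriptions suggest. Here’s a toy sketch in Python (my own helper, not from any actual reference): sort each hand’s card ranks in descending order and compare the resulting tuples.

```python
# Toy illustration: tie-breaking two hands of the same type (say, two
# flushes) by comparing card ranks, highest first, as tuples.
def highcard_key(ranks):
    """Ranks as integers (2-14, with ace = 14); the higher key wins."""
    return tuple(sorted(ranks, reverse=True))

a = highcard_key([14, 9, 7, 5, 3])  # A-9-7-5-3 flush
b = highcard_key([14, 9, 7, 5, 2])  # A-9-7-5-2 flush
print(a > b)  # True: the hands differ only on the fifth card
```

Lexicographic tuple comparison does exactly what a careful human would: compare the highest cards first, and fall through to the next card only on ties.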
I’ve written about netbooks before, but a recent post from Jeff Atwood drove me absolutely crazy. Atwood is working to take Cringely’s place as the only consistently-wrong blog I subscribe to. First he quotes Dave Winer’s definition of a netbook:
I [did it last year](http://v.cx/2008/12/predictions), with fair results, so it’s time for another try. These are my predictions for 2010.
In his last broadcast of NewsHour, Jim Lehrer presented a list of “guidelines of practice” for good journalism:
Before moving on to my predictions for 2010, I’ll take a look at how my predictions for 2009 worked out:
I haven’t really been following Michael Arrington’s attempt to mass-produce a tablet computer for web browsing. I read the initial announcement and dismissed the project as a pipe dream from yet another blogger who thinks having a lot of readers means you know what you’re talking about. This week Arrington announced that the project had failed.
If you don’t feel that you are possibly on the edge of humiliating yourself, or losing control of the whole thing, then probably what you are doing isn’t very vital.
Oxford University offers a “Virtual Private Network” service to its students, faculty, and staff. The most common reason people need to use this service is to get access to the wider internet using the Oxford Wireless LAN service. In fact, in almost all places where the OWL wireless network is available, the eduroam network is also available. Eduroam is a UK-wide network available at most major universities in the country and it does not require use of any special VPN, so I highly recommend that anyone new to Oxford take the trouble to configure their computers for eduroam instead of OWL. (The lesson, I’m afraid, is that IT services are better when they are not designed by the IT staff at Oxford.)
From an excellent case study on the practical impact of GPL licensing:
Agronomist Norman Borlaug died yesterday. His work is largely responsible for increases in the world food supply that have saved the lives of billions of people. Literally: billions. Even a short interview with him gives a huge amount of insight into the technical side of a field few geeks know anything about:
I really enjoyed fantasy (American) football for the years I played, but it always upset me when players earned fantasy points for things that didn’t help their team, or were penalized for actions that did. The most common examples are scoring drives with the clock running down: a short pass to a receiver who is immediately tackled in the middle of the field is as bad as a sack in many such situations, but both quarterback and receiver get points for it. Similarly, an attempt at a 50-yard “hail mary” pass that results in an interception as time expires just shouldn’t be scored like other interceptions: it’s no worse for the quarterback and no better for the defense than an incompletion. Changing scoring to take account of such “game situation” parameters would be complicated, however, so I understand why few (if any) leagues make exceptions for such circumstances.
Raymond Chen describes an important old-media skill, writing to length:
According to Wired the average American consumes about 9 hours of media a day. The most amazing part to me is how little is devoted to television:
There’s a nice post today from the Wall Street Journal Numbers Guy about the new voting system for the Oscars. Since they increased the number of Best Picture nominees from five to ten, they’ve scrapped the “vote for a single film; the film with the most votes wins” system:
Last week, Farhad Manjoo, the technology columnist for Slate, published a piece highlighting the control many large organizations, including the US State Department, exert over their employees’ computers. He argued that such restrictive policies were often misguided. His reasons, in a nutshell:
An excellent article (which is worth reading in full) from Foreign Policy magazine makes an incidental clarification of some of the terminology used in the health care debate. It’s obviously nonsense that Obama’s plan is a “single-payer” system, but I hadn’t considered the fact that even if it were, it still wouldn’t be a socialized system. Here’s the terminology:
Writer John Scalzi has posted a collection of “design flaws” in the Star Wars universe. I understand that this is just harmless fun, but his criticism seems typical of the mindless “what a bunch of idiots!” complaints the ignorant like to level against experts with specialized knowledge and extensive experience. In this case, fictional engineering experts. Okay, so I’m taking this too seriously. Anyway, my rebuttal:
This will be my most self-absorbed post yet. If you’re not fascinated by people telling you about their current medical conditions, you might want to skip this one.
Everyone is familiar with the spring boxes attached to doors, which usually look something like this:
I could go on at length about what the health care “debate” says about US politics and culture, but I’ll restrict myself to five points:
Jeff Atwood has been considering the profitability of low-priced software, citing the iPhone App Store and discounted games prices as examples. He goes on to opine about Windows pricing:
Something to add to the list of reasons to leave the UK:
A few obvious thoughts on the latest tablet “report”:
And now some praise for economic theory, although in this case it’s for theory presented (and possibly developed) by a writer and not a full-time economist. Hugo Lindgren’s brilliant Hot Waitress Index:
I just came across a surprising paper from economists Aaron S. Edlin and Pinar Karaca-Mandic published in the Journal of Political Economy in 2006 which claims to address the “accident externality” due to driving. Here’s the abstract:
In a previous post I argued that it was Palm’s WebOS that had emerged as a legitimate threat to Apple’s iPhone; Google’s Android platform is left in third place to compete with the Blackberry OS, Windows Mobile, and Symbian—all of which now look very much like previous-generation technology. I also said this about Android:
A reader just let me know that my repackaged Digg feed recently stopped working.
Not many NBA playoff games make it onto television in the UK; usually it’s only the finals. This year I’ve been watching over the internet using the NBA’s International League Pass service.
Trying hard to be offended by the cultural stereotypes (or the odd weapon/ammunition combination), but failing. I always thought Jasmine was Disney’s hottest character…
In a post five months ago I claimed that Google hadn’t seen its peak yet. While I expect the company’s revenue to continue to climb, I’m now a lot more skeptical of its influence and credibility as an innovator.
I had the rare “opportunity” to sit in on a meeting with an IT vendor today. This vendor sells a Network Access Control product which works as follows:
The web is young enough that we have yet to achieve any real consensus on a robust manual of style. With technology changing so quickly, any detailed guide would rapidly fall out of date. But here’s a general rule that takes care of several of the common annoyances I find in even “professional”-level writing on the web:
Beautiful Soup is an absolutely terrific Python library for parsing HTML and XML. Its strength is its ability to offer a clean document tree even for bad markup (including such gory details as converting everything to unicode intelligently).
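A minimal example of what I mean (nothing clever, just bs4’s basic API):

```python
from bs4 import BeautifulSoup

# The <b> tag is never closed, but Beautiful Soup still produces a
# navigable tree, closing open tags at the end of input.
soup = BeautifulSoup("<p>Hello <b>world", "html.parser")
print(soup.p.b.get_text())  # world
print(soup.get_text())      # Hello world
```

You get a clean tree to navigate and search, no matter how sloppy the markup that went in.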
Donald Knuth is a visiting professor here at Oxford’s Computing Laboratory, and during the brief periods when he is in residence he and I frequently use the same printer. I’ve never actually succumbed to the temptation to read his drafts while I’m waiting for my own printouts, but I’d be lying if I said I wasn’t tempted.
I opened a prior post about my repackaged Digg feed like this:
When typing, some people use a single space after a sentence; others use two spaces. For the most part, I think these typing habits are meant to relate to two different typographic styles: double-spacing approximates what some call “english spacing”, while the single-spacing style is usually called “french spacing” (although “american spacing” might be more accurate for English-language text).
I didn’t expect this meme to be terribly interesting, but I ran a script to find the oldest files on my machine anyway. Beyond the fact that I have a huge number of files last modified between 1900 and 1903 or on 1 January 1970, I was surprised to discover that my oldest files whose modification dates have been successfully preserved are photographs taken nine (!) years ago after I first tried cutting my own hair. For the record, I thought the haircut was very comfortable, but others felt the aesthetics left something to be desired.
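The script itself isn’t worth publishing, but the pattern is trivial. A sketch of the idea (a reconstruction, not my original script):

```python
import os

def oldest_files(root, n=10):
    """Walk `root` and return the n file paths with the earliest mtimes."""
    ages = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                ages.append((os.path.getmtime(path), path))
            except OSError:
                continue  # broken symlinks and the like
    return [path for _mtime, path in sorted(ages)[:n]]
```

Bogus-looking dates like 1 January 1970 are usually just mtimes of zero (the Unix epoch) left behind by tools that failed to preserve timestamps.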
Finally set up my Amazon Associates account, so here’s some pointless shilling. Just a few of the best nonfiction books I’ve found; each represents the best of at least half a dozen books I’ve read on the same topic (with the exception of McGee’s book on the science of cooking; I’ve never encountered anything comparable). If you read all of these, then you’ll be much less impressed when I regurgitate their contents, and our relationship may suffer.
A work in progress.
Continuing on the theme of small tools, a tweet from John Siracusa just reminded me that I really should back up old tweets. So I wrote a very bare-bones Python script to dump all of a user’s public tweets into RFC2822-style files, each named with the ID of the tweet it represents.
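The fetching half used Twitter’s public API of the time, so it isn’t worth reproducing; the output half looked roughly like this (a sketch with a made-up tweet dict, not the script verbatim):

```python
import os
from email.message import EmailMessage

def dump_tweet(tweet, outdir):
    """Write one tweet (a dict) to outdir as an RFC 2822-style file."""
    msg = EmailMessage()
    msg["From"] = tweet["user"]
    msg["Date"] = tweet["created_at"]
    msg["Message-ID"] = str(tweet["id"])
    msg.set_content(tweet["text"])
    path = os.path.join(outdir, "%d.txt" % tweet["id"])
    with open(path, "w") as f:
        f.write(msg.as_string())
    return path
```

The RFC 2822 format is a nice choice for this kind of archive: it’s human-readable, trivially greppable, and any mail library can parse it back.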
In what should become an annual tradition, here are some predictions for 2009. I should evaluate them a year from now alongside my 2010 predictions.
I subscribe to Digg’s syndicated feed, although more to keep track of the zeitgeist than for information—I probably actually click through only one or two stories a day (unless I’m very very bored).
Cringely’s last PBS column is up, and I thought I’d honor the last installment of the only consistently-wrong blog I subscribe to with just a touch of the analysis I wanted to give to every Cringely post. In this week’s episode, Bob looks back on his predictions for 2008:
We still can’t even really agree on a definition of “machine intelligence”, but we have learned a few things from fifty years of research. The notion that sheer processing power is enough to spark the emergence of human-like intelligence now seems misguided: the human mind did not emerge in a vacuum, and it is not a uniform general-purpose processing system.
I can’t take the ridiculous day/night split in the UK. In the winter it’s dark before 4 PM, and in the summer the sun is well above the horizon by 5 AM. Whether you live at a similarly high latitude or not, I’m sure you can appreciate my situation and I have every confidence that you are willing to do anything possible to help.
I completely reformatted the hard drives of a couple of Mac laptops a few months ago and did all the installation to make either one useful as my day-to-day machine. There are enough steps (and enough things that I forgot and then later had to interrupt my work to do) that I made a handy reference list for the next time I need to set up a new machine from scratch.
A slashdot post announces two “credible” replacements for iTunes: Songbird and Amarok. Apparently these are credible because they pack in the features, including support for extensions and themes/skins.
BetterType does for unicode what Textile and Markdown do for HTML: it allows you to author expressive documents with the tools you already use, and the “source” code looks like regular (ASCII) text. BetterType performs transformations on its input (either plaintext or HTML) to take advantage of the full unicode vocabulary, including such things as replacing straight ' and " characters with ‘ or ’ and “ or ”, ... with …, and (c) with ©. A wide variety of transformations can be enabled, and adding new translations (which can be context-dependent) is straightforward.
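To give the flavor of it, here’s a toy version of a few of those transformations (this is not BetterType’s code, just an illustration of what a context-dependent replacement looks like):

```python
import re

# Toy "educator": a quote is an opening quote if it follows whitespace or
# the start of text, and a closing quote otherwise. Ellipses and (c) are
# simple context-free substitutions.
def educate(text):
    text = text.replace("...", "\u2026")       # ... -> ellipsis
    text = text.replace("(c)", "\u00a9")       # (c) -> copyright sign
    text = re.sub(r'(^|(?<=\s))"', "\u201c", text)  # opening double quote
    text = text.replace('"', "\u201d")              # remaining are closers
    return text

print(educate('She said "wait..." (c) 2008'))
# → She said “wait…” © 2008
```

The real tool supports many more transformations than this, but the quote rule shows why context matters: a straight quote alone doesn’t tell you which curly character it should become.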
I realize this is trite, but here goes:
Rejected from Apple’s app store:
Whenever Apple does a Q&A at a release event, I always wonder whether I have any questions I’d really like to ask. Usually I can’t come up with much beyond what the people in the room ask, but this time I had one.
Some great stuff from a page full of lies we teach pre-college students, including why the sky is blue:
The abstract of a 2003 paper by Utpal Bhattacharya:
George Saunders writing for the New Yorker:
I think it’s generally bad form to link to items that don’t deserve additional publicity, but this video opened my eyes to some real dangers for mathematics education.
Posted on Gizmodo:
http://www.sportsscientists.com lists the splits for each ten-meter interval of Bolt’s world-record 100m race like this: 1.85, 1.02, 0.91, 0.87, 0.85, 0.82, 0.82, 0.82, 0.83, 0.90.
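Those splits are fun to sanity-check (using the reported 10 m segment times, which are only good to 0.01 s):

```python
splits = [1.85, 1.02, 0.91, 0.87, 0.85, 0.82, 0.82, 0.82, 0.83, 0.90]
print(round(sum(splits), 2))        # 9.69: the world-record time
top_speed = 10 / min(splits)        # fastest 10 m segment
print(round(top_speed, 1))          # 12.2 m/s mid-race
```

The segments add up to the official 9.69, and the mid-race segments work out to over 12 metres per second: roughly 27 miles per hour.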
Paul Lockhart, from an essay written way back in 2002:
For months now there has been an email in my “spam” mailbox that I can’t delete. I try deleting it, but I fail. Worse, if I select a lot of messages and then try to delete them all, the entire operation fails because of this one zombie mail. So cleaning out my spam mailbox is a lot tougher than just ‘scan the borderline cases, select all, delete’.
Peter Norvig has been doing some work on optimal strategies for the Beauty and the Geek TV show, based on a challenge from the Freakonomics blog. I’ve never seen the TV program, but I must admit that I found his approach to identifying the best strategy fascinating. He just mocked up a little simulator in Python, coded a few simple strategies, and had them compete against each other in hopes of learning something.
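The pattern itself is worth stealing: write strategies as plain functions, write a simulator that plays them, and compare average scores. Here it is in miniature for a dice game I just made up (nothing to do with the show’s actual rules):

```python
import random

# Made-up game: each turn you roll a die repeatedly, banking the running
# total when you stop, but a roll of 1 wipes out the whole turn. A
# "strategy" is any function deciding, given the points at stake, whether
# to keep rolling.
def play_turn(strategy, rng):
    turn = 0
    while strategy(turn):
        roll = rng.randint(1, 6)
        if roll == 1:
            return 0     # bust: the whole turn is lost
        turn += roll
    return turn

def average_score(strategy, trials=20000, seed=0):
    rng = random.Random(seed)
    return sum(play_turn(strategy, rng) for _ in range(trials)) / trials

cautious = lambda turn: turn < 5    # bank almost immediately
bold = lambda turn: turn < 20       # push on until 20 points are at stake
print(round(average_score(cautious), 2), round(average_score(bold), 2))
```

A few dozen lines and you learn something real: in this game the bolder threshold reliably outscores the cautious one, which is exactly the kind of non-obvious result that makes the simulate-and-compare approach so appealing.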
I seem to have stumbled across a bug in Python’s libraries for dealing with unicode data: apparently mixing calls to readline() and readlines() works just fine for regular 8-bit input files, but not for files read through a codec. Given any file sample.txt with more than a few dozen characters—even just ASCII characters—the version of Python 2.5.1 which ships as part of Mac OS X behaves like this: