Remember virtual pets? Those little electronic animals that lived in keychains, and you had to feed them and clean up their poop, and they were really neat for about two weeks, before everyone* realized that pressing buttons to pretend to feed and play with something is totally boring?

*Except Japanese people, who apparently still buy them in great numbers, but when it comes to adorable tchotchkes, they're always outliers anyway.

Ted Chiang's novella The Lifecycle of Software Objects will very likely remind you of virtual pets: in the near future, a company creates a breed of adorable digital creatures that exist only online in virtual worlds (it seems in the near future social networking will be less "Facebook wall" and more "elf avatars walking around" a la Snow Crash, which just sounds annoying to me and my slow internet connection). The creatures (called digients) are really advanced, and can learn and adapt and even speak, but they also need a lot of love and attention, sort of like the Tamagotchi whose screen will fill up with poop, causing him to evolve into an angry blob, if you don't pretend to take him for a walk three times a day. God, those were the stupidest toys. So for a few years, digients are the hip new fad, but then most users realize that it's a lot of work caring for a pretend animal thingy, and put their programs on pause or stop using them altogether. But are the digients really pretend creatures?
A select group of designer/users, who have had their digients the longest, become extremely dedicated to seeing how far they will evolve, paying for tutoring, watching real-life relationships fall apart in favor of time spent in the virtual world, and dealing with software obsolescence issues that require them to make some unsavory choices about the future of the virtual "species."

This is a book of ideas, not a book of characters -- the humans have stock personalities and conflicts that won't interest you too much, and the majority of this short book spends its time setting up a bunch of interesting what-if scenarios that are futuristic but not far-fetched. Like: if a computer program could somehow be programmed to experience discomfort, would we be morally obligated to protect it from harm? What is the line between programmed intelligence and actual self-awareness -- and if a computer program is arguably as self-aware as a cat, do we have any moral responsibility toward it? Am I a total ass for making my underfed Digimon fight to the death and then abandoning them until their batteries ran out? If a software program can be programmed to love you in a sexual way, is that gross? (Yes.)

Then there's all the commentary on The Way We Live Now, i.e. wtf, internet? Because the main characters spend so much of their lives focused on and caring for these creatures (the book spans a decade), they are hooked up to their computers virtually all the time, which seems excessive -- but then there are those people who clock literal months' worth of real time playing online role-playing games, spending hours that could be spent racking up real money and life experience on making fake gold and earning magical experience, or however it works. Also there are huge farms of people in Asia who do nothing but earn virtual currency all day so nerds in the U.S. can pay real money for fake on eBay, and isn't that weird? Exactly how real are these interactions?
How much value do they have compared to the face-to-face communication humanity has had to rely on for the last 6,000 years?

That's a lot of interesting material for a slim little book like this. It even has funny pictures. It's more fun than a litter of Pikachus.