Thursday, December 1, 2011

Faster than the Internet

A Chinese company, BGI, operates 167 DNA sequencers, which altogether produce about 2,000 human genomes’ worth of data every day. Apparently they collect so much data that they outpace the maximum transfer speeds currently available on the internet, so they have to ship disks of data via FedEx. I actually find it humorous that we have gotten so good at reading DNA sequences that we have to revert to a somewhat archaic means of data exchange rather than just using the internet. According to the article I read, DNA sequencing is accelerating much faster than Moore’s law allows computing to accelerate – this reminded me of the several discussions of Moore’s law we had throughout class. I can’t justify sequencing such a vast amount of DNA if the data is too vast to be thoroughly analyzed. It would make much more sense to me if DNA analysts ramped down their sequencing pace a little and spent more time analyzing the data they have already extracted.
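Out of curiosity, here is a back-of-envelope sketch of why shipping disks can beat the network. The ~100 GB per genome figure is my own assumption for illustration – actual raw sequencer output varies widely – but the 2,000 genomes per day comes from the article:

```python
# Back-of-envelope: why ship disks instead of uploading?
# ASSUMPTION: ~100 GB of raw data per sequenced genome (actual output varies).
GENOMES_PER_DAY = 2_000
GB_PER_GENOME = 100
SECONDS_PER_DAY = 24 * 60 * 60

total_gb_per_day = GENOMES_PER_DAY * GB_PER_GENOME      # 200,000 GB per day
required_gbps = total_gb_per_day * 8 / SECONDS_PER_DAY  # gigabits per second

print(f"{total_gb_per_day:,} GB/day needs about {required_gbps:.1f} Gbit/s sustained")
```

Under those assumptions you’d need roughly 18.5 Gbit/s of sustained transfer around the clock, which makes a box of hard drives on a FedEx truck look surprisingly competitive.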

Supposedly, it’s going to cost under a thousand dollars for one person to have their entire DNA sequence read and examined within a year or two. This is really cool, but I am skeptical, in that I wonder what we can actually do with that data at this point in our technological development. Are we already able to diagnose medical issues by finding irregularities in DNA? I would think that something as scientifically and medically significant as the ability to diagnose problems preventatively via the use of DNA sequencing would have at least made the news, but I have heard next to nothing about this. I also wonder if, when this technology becomes mainstream, people will argue about its ethics. People argue endlessly over the moral correctness of stem cells and stem cell research and I see nothing stopping them from bringing up the very same or similar points about DNA sequencing technology.

I look forward to the future of DNA sequencing as a medical aid, and I especially look forward to it becoming cheap enough for everyone to use. Perhaps at some point in the future every newborn child will have their DNA sequenced within the first hours of their life, and perhaps that sort of practice will eliminate infant mortality. Perhaps people will live for 200 years once we can prevent and cure diseases before they become problematic. Pretty much everything seems within the adjacent possible of DNA sequencing and repair.

Thursday, November 24, 2011

Brain Implants

As a continuation of the themes we talked about when we read Neuromancer, I’d like to talk about some recent improvements in the field of brain implants. Prior to a new development in brain implantation technology, implants were limited to about 100 electrodes because a bulky wire had to be attached to each and every one. This technology has been around since the mid-twentieth century and had changed little until just recently. The problem with such a low number of electrodes is that they can’t provide a very clear picture of brain activity, which could lead doctors to overdo treatments and perform invasive procedures when they aren’t strictly needed.

Scientists at the University of Pennsylvania have recently developed a new type of sensor array that can fit 360 sensors into its much less bulky surface. It’s said to give a roughly 50 times better resolution than its predecessor, which could revolutionize the treatment of epilepsy and other disorders caused by incorrect brain function. Scientists’ biggest need, in terms of brain treatment, is to get a clearer picture of what’s going on inside the brain itself. By researching healthy brains then comparing them to unhealthy ones, they are likely going to be able to pinpoint problem areas and potentially treat them specifically.

I think this is really cool. I wasn’t even aware that brain implants existed at all. This reminds me a lot of Neuromancer, except that these implants are necessities rather than leisure items. It’s very interesting to me that people can have their illnesses “diagnosed” with the help of a little chip in their head. It seems like science fiction, even to me. I hope that in the coming years the scientists working on these new implants will continue to increase the number of sensors they can fit per unit area, and I hope those increases will lead to new developments in the fields of medicine and anatomy. I think this technology has a whole lot of possibility even beyond what it already does.

Saturday, November 19, 2011

Can the Internet Predict the Future?

Data is constantly becoming more available across the internet as people rely more and more on computers in their day-to-day lives. With these ever-expanding databases available to the general public, companies have begun attempting to pore through them and create “temporal indexes” of trending statements on the internet. These programs work by searching through vast numbers of websites of all types and looking for predictions. One major company in this industry is Recorded Future, which claims to search through 100,000 websites of every type imaginable each hour. Generally, their services go for nearly $10,000 per month, but lately they have offered a more individual version of the product for less than $200 a month. That’s something many individuals could afford, which makes me wonder: why would anyone need to predict the future like this?

I have a lot of skepticism about this kind of technology. First and foremost, I would like to know how the programs distinguish between factual information and speculation or opinion. I simply do not see how a computer, reading information at the rate of 100,000 websites per hour, can separate the useful information from the junk. Assuming, however, that it can, there are more questions about its methods. How does it treat information that is only up on a website for a short time and is then taken down? I don’t think this should be treated the same way as permanent information, as there was obviously some reason for it to be removed, be it false, slanderous, or otherwise. Apparently Recorded Future’s more expensive packages are purchased by hedge funds, which use them to “predict the future” and enhance their trading profits.

Assuming that these technologies are legitimate, and as reliable as they are said to be, they are extremely interesting to me. The hedge fund product predicts stock values five days in advance, which is absolutely remarkable. I would love to see some sort of data concerning the accuracy of its predictions. Five days is a lifetime in the stock market – someone with that much foresight could easily out-trade competitors who are limited to working with real-time data. This may seem obvious to the experienced investor, but these programs have shown that the news really does affect the stock market. It’s said that when people expect corporate restructuring, the stock of the company in question tends to drop approximately 2 percent over the following month. I think this is evidence for a previously only intuitive concept.

I actually fear the day when everyone has access to this type of technology. If everyone is trading stocks with a five day window into the future, how will that affect the market itself? I can’t see how it could possibly retain stability with millions upon millions of people trading with knowledge of the future. 

Sunday, November 13, 2011

Why Using the Internet is Getting More Dangerous

Scientists are worried about the future of data transmission across the internet. People seem to consider the Internet a permanent thing that can never go away, but that is actually incorrect. Supposedly, Amazon and Google have both experienced outages in their respective business and mail clouds (basically huge, shifting conglomerations of data) due to a sort of “Internet failure”. I had never considered the Internet as something that could break down, but it does make sense that it has its weaknesses. The internet bears a larger and larger data “burden” by the hour, and it isn’t some untouchable, invincible thing – logically, it is bound to falter at some point or another.

Since almost everyone uses the internet daily for a range of tasks of varying importance to their lives, it’s actually kind of scary to consider the possibility of some sort of “Internet blackout”. Think about what you do every day that involves the internet – the first things that come to mind are probably trivial, such as social networking. However, some deeper thought will likely lead to more significant uses: checking the news, staying in touch with family members, buying and selling things. Imagine the implications an all-encompassing internet outage would have for the stock market, or for financial transactions in general. Banks wouldn’t be able to access their stored information, which would either dramatically complicate or altogether halt money transfers, and commerce would slow or stop.

In addition to the possibility of “overloading” the internet to the point of failure, there is also the possibility of increasingly hostile internet users. If data transfer becomes 100 times faster, it follows fairly logically that hacking, identity theft, and other malicious activity via the internet will become 100 times faster as well. This means that money needs to be spent on beefing up security systems. Just today (Nov. 13th), there has been some sort of scandal across Facebook resulting from the actions of the hacking group “Anonymous”. If the internet were 100 times faster, the obscene material being broadcast by this group would circulate that many times more quickly. While in this particular case no one is being physically harmed, it’s quite possible that more immediately harmful things could be done via the internet in the near future. Hopefully I don’t sound like some sort of crazy doomsayer, but these things seem pretty feasible to someone with a basic understanding of how networking works, and I do not want to fall victim to any form of abuse over the internet.

Friday, November 4, 2011

Virtual Reality

Virtual reality, in its most basic definition, is the use of computing technology to simulate real events, places, people, or environments. To date, we haven’t really created virtual reality technology like that in Neuromancer, but I’d say we are on our way there. Tron (the original version, not the recent remake) is considered the first movie to toy with the idea of virtual reality, back in 1982. It’s really interesting that in the less than three decades since we first conceived of virtual reality technology, we have come so far. I’d like to talk about a few of the more advanced forms of VR technology out there right now.

The US military uses virtual reality to train people. One of the bigger implementations is in aircraft pilot training. The Air Force currently uses flight simulators to give trainee pilots a “taste” of what flying is like before they really get into the air. This is potentially lifesaving, in that it lets flight instructors learn what weaknesses a trainee may have before they actually get behind the joystick and fly a real plane. The problem with this type of training is that it cannot yet approach the realism of actually flying a plane. Consider yourself in a math class, taking a math test. If you know the test is a practice version, chances are high that you won’t put forth as much effort as you would on a final exam. The test itself might have the same questions, but the stakes are different. The same principle applies to virtual reality as a flight training mechanism.


Perhaps a less immediately significant use for virtual reality technology is in the gaming industry. We’ve discussed the huge potential of VR technology in gaming in class, but I’d like to explore what’s already out there. As most of you probably know, the Xbox 360 and the Wii both have motion-sensing technology (the Wii is based around it entirely, whereas the Xbox has an optional Kinect peripheral). One of the biggest limitations of virtual reality technology in gaming is its inability to project images at actual size. Most of us, unfortunately, do not have access to a room made up entirely of screens on all four walls, the floor, and the ceiling. That is what I picture when I think of the future of virtual reality gaming: a room made up of nothing but screens, and a suit for the player to wear, whose movements determine how the in-game avatar moves. I’m personally really looking forward to the advent of technology this complex, but I am unsure as to when we can expect it. Sooner rather than later, hopefully!

Monday, October 24, 2011

Artificial Intelligence

I was doing some reading about artificial intelligence (AI) in conjunction with my reading of Neuromancer, and I stumbled across some really interesting (although arguably frivolous) creations. The first, designed by a Vietnamese robotics firm called TOSY, is a ping-pong-playing robot. Apparently, the robot has undergone two remodels and is now in its third form, TOPIO 3.0. This robot played against humans at a robotics fair in Tokyo recently, and is said to have held its own. This is absolutely amazing. Ping pong is (and I speak from experience) one of those sports that requires insane amounts of coordination, and TOPIO 3.0 was able to return 10 shots in a row against a human. One might say this is a waste of money, or a waste of robotics expertise, but if you ask me, this is astounding.
                                                                            
Another such artificially intelligent creation is the famous Jeopardy-playing computer Watson. As we all know, Watson dominated its competition in each and every game of Jeopardy it played, beating out even the stars Ken Jennings and Brad Rutter by large amounts of money. Again, I find this truly fascinating. Upon doing some research, I found that Watson contains 90 IBM servers and 16 TB of RAM (for those of us who aren’t computer savvy, that’s about 2,000 times the amount found in the average high-end NC State laptop; most 2011-era laptops come standard with 4 or 8 GB of RAM). This is ridiculous. Apparently Watson can process the data of roughly a million books in a second.
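The “about 2,000 times” figure checks out if you run the numbers from the comparison above (taking 1 TB = 1,024 GB and an 8 GB laptop as the baseline):

```python
# Sanity check on the "about 2,000 times" figure: Watson's reported 16 TB
# of RAM versus a high-end 8 GB laptop.
watson_ram_gb = 16 * 1024  # 16 TB expressed in GB
laptop_ram_gb = 8          # standard high-end 2011 laptop

ratio = watson_ram_gb / laptop_ram_gb  # 2048.0
print(f"Watson has roughly {ratio:.0f}x the RAM of an 8 GB laptop")
```

Against a 4 GB laptop the ratio doubles to about 4,000, so “about 2,000 times” is the conservative end of the comparison.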

I think that if you consider how much raw data can be interpreted and used by supercomputers in a short amount of time, it seems inevitable that at some point computing technology will reach the point where it can interpret things on its own. I have no idea how or when this will happen, but I am interested to see how long it takes for Neuromancer to become even more accurate. I don’t think that improving artificial intelligence to unheard-of levels is that far off; I think it will happen sooner rather than later. Hopefully our version of Neuromancer’s Turing police keeps a strong hold on any AI that turns out to be selfish or malicious. This is one of those things that is really interesting precisely because no one knows what exactly will happen.

Monday, October 10, 2011

Green Versus Green



As a chemical engineering major, I had previously assumed that members of my prospective profession just create things without having to worry about environmental consequences. Surprisingly, and perhaps disturbingly, this assumption was somewhat correct until recently. It seems that when chemistry first became a large trade, in which many products were produced more cheaply and efficiently than ever before, environmental concerns were next to nonexistent. However, starting in the nineties, some scientists began conducting their chemical analyses and experiments with green motivations. Rather than creating things at any cost – be it mercury-tainted water supplies, smoggy atmospheric conditions, or some other pollution – chemists are more and more beginning to go green.

This “green revolution” in the fields of chemistry and chemical engineering makes me wonder what exactly we are sacrificing when we concentrate on one green over the other. Some companies are green only in the monetary sense: they operate for pure profit and disregard the environment as much as possible; that is, they meet industry requirements, and that’s it. However, the trend is to be green in the environmental sense of the word – to take responsibility for emissions, effluents, and wastes, which are dangerous to the fragile ecosystems into which they are released.

If we were to focus purely on efficiency and profit and completely ignore environmental concerns, would the extra quality of products warrant this decision? Would the products we produced under this mindset even be better than those manufactured under different conditions? These are the questions that will have to be answered in the near future by those of us who make things, and as far as I can tell, every one of us Franklins will have to deal with this dilemma once we reach the workforce in 4 or 5 years. I personally am looking forward to being on the crest of a wave of green. Where that wave eventually breaks I am eager to find out.

Sunday, September 25, 2011

Above and Beyond Expectations


Until recently I had never considered the complexities involved in the disposal of hazardous chemicals. Sure, I learned a little bit about the dangers of toxic waste being dumped into landfills in high school environmental science, but I definitely had not considered the fact that everyday objects contain these chemicals. I pictured hazardous waste disposal as barrels of chemicals being dumped into a landfill, not fridges being dropped in and leaking their coolant over time. It seems apparent to me now that getting rid of harmful chemicals is a multifaceted problem. For one thing, how do you determine what can and cannot be put into a landfill? If something contains dangerous substances, can you put it into a landfill intact and hope that it never ruptures and allows its contents to infiltrate the groundwater? Certain chemicals can be deadly at or below concentrations of just a few parts per billion. It’s actually pretty scary to think about what substances could be present in my drinking water, or in the water I use to shower, thanks to carelessly disposed-of waste.

It was reassuring to stumble across the article that motivated me to write this blog. The article talked about the dangers involved in disposing of old refrigerators that still contain their original coolant. Certain recycling companies have recently taken the initiative to use robotic systems to “squeeze out” the excess coolant so that it can be disposed of separately, in a less environmentally harmful way. They have also developed a technique in which they compress the insulation of the fridges into pellets that can be burned as fuel; these pellets do not release noxious gases or environmentally degrading chemicals. It makes me wonder how many fridges with these toxic substances have already been disposed of in landfills, and how much waste has made its way into the groundwater. I also wonder to what extent, if any, this coolant-extraction technique will catch on. I personally hope it becomes more accepted, because I strongly dislike the thought of drinking water laced with trace amounts of fridge coolant.

Thursday, September 15, 2011

Google's Instant Search

I was searching the internet working on a physics assignment earlier this week when it struck me that Google’s search finisher (the thing that pops up with suggestions in a list underneath the search bar as you type) probably saves a whole lot of time, in addition to providing mediocre entertainment in some cases. I quickly abandoned my rapidly-approaching-its-deadline physics homework to do some tangential research, in which I found out that the magnificent search completer referenced above is called “Google Instant”. According to Google, the Instant feature saves users an average of 2 to 5 seconds per query. Prior to Google Instant, the average search took 9 seconds, and some searches took upwards of 30 seconds. Further statistics reveal that Google users are collectively saved around 3.5 billion seconds per day, or about 11 hours of user time per wall-clock second. I find this absolutely incredible. This number of seconds (equivalent to nearly 111 years per day, if my math is correct) is enormous. It really speaks volumes about the sheer number of people using Google and the internet on a daily basis.
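Both aggregate figures fall straight out of the reported 3.5 billion seconds saved per day, so the math is easy to double-check:

```python
# Checking the two aggregate figures derived from Google's reported
# 3.5 billion seconds of user time saved per day.
SAVED_SECONDS_PER_DAY = 3.5e9
SECONDS_PER_DAY = 24 * 60 * 60
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

# Saved user time accumulating during each wall-clock second, in hours:
hours_saved_per_second = SAVED_SECONDS_PER_DAY / SECONDS_PER_DAY / 3600
# Total saved time per day, expressed in years:
years_saved_per_day = SAVED_SECONDS_PER_DAY / SECONDS_PER_YEAR

print(f"about {hours_saved_per_second:.1f} hours saved per second")
print(f"about {years_saved_per_day:.0f} years saved per day")
```

That works out to roughly 11.25 hours of aggregate user time saved every second, and just under 111 years saved every day.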

I think it is very interesting that Google went through the process of creating a new technology to save people less than 5 seconds per search. It makes it apparent that search technology has reached a point where efficiency is nearing the best conceivable level. Not only did they have to develop 15 new technologies to get instant search working, they also had to ensure that the feature would not end up being detrimental on slow connections. I didn’t even know the feature was toggleable, but apparently you can go into preferences within Google and turn it off manually. However, I don’t see any reason to do so, seeing as I have become accustomed to having Google read my mind and fill in the rest of my search bar with something close enough to what I am looking for to save me a valuable 3 seconds. In fact, I can hardly remember using Google at all before the addition of instant searching. It just goes to show that, as with Facebook and countless other new technologies, we as human beings have a hard time remembering things as they were before we had access to such luxuries as social networking and accelerated searching.

Monday, September 5, 2011

Facebook: An Involuntary Waste of Time?


Entertainment is a huge part of the life of every modern American citizen. Even those who are not themselves partaking in forms of recreation are affected by its prevalence on a daily basis. It is estimated that Facebook users spend 700 billion minutes on the site every month[1]. I find it interesting to think about this from a different angle. What if, instead of surfing Facebook and doing arguably nothing productive, people instead spent their time reading, or otherwise attempting to learn? What if they spent their time exercising? If the amount of effort spent on social networking were expended on more “positive” and “useful” activities, I think great benefits would come sooner than we may think.

On the other hand, social networking is a setting in which advertisements run rampant. If significantly fewer people were to partake in social networking on a day-to-day basis, these advertisements would be much, much less effective and economical. Companies would once again have to pay for more expensive means of advertising, such as running more ads in newspapers and magazines and airing more commercials, rather than just letting their Facebook pages circulate through circles of users. Facebook is interesting in that it is an enormous billboard on which companies are free to advertise (although I’m sure there are situations in which partnerships exist, generating some of the massive revenue that Facebook rakes in). I can’t think of any other means by which marketing is “free” to this great an extent.

It is phenomena such as these that make social networking sites interesting to me. While I enjoy the ease of communication I am given by Facebook, it really doesn’t give me much that I couldn’t get elsewhere. If I were to disable my Facebook account, I would just be forced to use email and text messages more. I don’t think that it would be too painful, but then again, I haven’t ever deleted or disabled my Facebook, so maybe that fact in itself says something about the grip that it has on its users. It’s almost as if people enjoy being shown advertisement after advertisement, yet are unaware of what is going on. I am interested to see where social networking goes in the future. I imagine that within the decade, Facebook will either cost money to use, or will cease to exist. The company is becoming such a dominant force in the realm of social networking that it seems inevitable that it will begin charging for its services.

And if intelligently priced, I doubt it will lose that many users at all.

Thursday, August 25, 2011

Temperature Control


It is definitely easy to take important technologies for granted. I find that the technology I least appreciate is that which I am not in direct control of. I have my cell phone and my laptop with me all the time so it is hard to forget that they are important to me (and that they cost quite a lot, and that I am unwilling to leave them unsupervised for even a trip down the hall and back). However, only when I venture out into the oppressive heat of midday Raleigh do I fully appreciate the value of the air conditioning, refrigeration, and running water that I routinely use to keep cool.

It’s amazing to me, having thought about it more today than I ever had before, that we have the technology to cool down millions of people simultaneously. The numbers behind this feat are probably extraordinary. I would be interested in seeing just how much electricity is used by, say, everyone in the Triangle on air conditioning. I would also like to know how much is used on refrigeration and cooling of food and water, and other essentials. I think that by comparing these numbers with other common uses we would find that an enormous part of our energy expenditure goes to keeping us cool (or in the colder parts of the year, warm).

This is intriguing to me further in that it makes me wonder if this is an opportunity to save energy. Why can’t we approach the energy problem by breaking it into parts, and approaching those individually? It certainly wouldn’t hurt to make homes and other buildings more resistant to outside temperature changes, and therefore less in need of temperature control. It wouldn’t hurt to make refrigerators, freezers, and heating and air conditioning units more efficient in their use of energy. In my opinion, if people were willing to tolerate one degree higher temperatures in the summer and one degree lower temperatures in the winter, lots and lots of energy expenditure could be completely avoided. One degree of difference wouldn’t be that significant to the people having to feel it, but compounded over millions of households and businesses, I don’t think the savings would be anything to sneeze at.

Monday, August 22, 2011

Fast Food: Is Haste Waste?


A few weeks ago, while vacationing at the beach, I was watching television and came across an hour-long show about the general history of fast food technology and the recent explosion of development in it. Needless to say, I was intrigued. On one level, it struck me as very impressive that fast food chains have reached the point where they can get you your meal, in many cases, within a single minute of ordering. I felt some sense of pride, given that many of these companies are American-based. However, it also seemed odd to me that so much money is going into research and development of new technologies aimed at shaving seconds off of fast food production time, while a vast number of people in the world are undernourished, let alone able to go to a fast food restaurant.

I realized that I know very little about what research is in progress and what technology exists in the context of feeding those who cannot acquire food themselves. I have made the assumption that at least some attention is being given to this issue, but to what degree I am quite unsure.

In 2010, surveys revealed that nearly a billion people in the world were hungry [1]. That being said, billions of dollars were, and continue to be, poured into research and development of faster and more efficient fast food technology. This issue raises some questions with a range of arguable answers. First, should money be spent on the impoverished people of the world who are unable to eat, if they have no money of their own with which to provide an economic incentive for companies to help them? Staunch capitalists would tend to argue against it, but those with a more humanitarian outlook would undoubtedly want to take some sort of action toward helping this large fraction of the world’s population. This brings us to a second important question: what exactly should be done about this problem? At least to me, there is no clear solution to a problem in which much money needs to be spent with little to no chance of recovering the capital. This seems like it should be a hotly debated issue, but I believe it is eclipsed by more “urgent” issues. As widespread hunger has been present for a very long time, people may actually consider it normal and may therefore be less inclined to seek out a solution.

I would love to be corrected if my assumption – that research into feeding the hungry is being done on a much smaller scale than research into making a cheeseburger three percent juicier or cooking fries 2.8 seconds faster per batch – is wrong. I have not been able to find much evidence to suggest it is, but I would be happy if such evidence existed somewhere. I don’t feel quite right about neglecting those in need in favor of servicing those who are too impatient to wait thirty more seconds for their meal.