My girlfriend and I have been preparing to build a new desktop computer. It's a general-interest sort of thing: partially for education, partially for entertainment. The fact of the matter is, we are sinking quite a bit into it to make sure that it'll last us a long time. However, something has always bothered me about this, something recently poked fun at in a Best Buy ad: why am I bothering with this? Why bother trying to build something great when, within a few months to a year, the next generation of parts will come out?
Simply put, technology is evolving. If anything, it's growing faster than we are. IBM's new computer Watson, a natural-language question-answering system, was put to the test a couple of months ago on Jeopardy!. Even against the two greatest legacy players (one holds the record for the highest winnings, while the other holds the record for the longest winning streak), it managed to win both games it played. It has become a technological marvel, able to think more like a human and parse information based on word choice and the manner in which words are used.
It frightens me to think that this technology will probably be considered obsolete within a decade. Not only that, it will be compressed into something smaller and more efficient, much like microchips today are as powerful as the first supercomputers at only a fraction of the price. It also frightens me that what we are building will probably be outdated in a matter of two or three years. Also somewhat depressing: in that time, something twice as powerful will come out at probably around the same price. This raises the question: is "future-proofing" really worth it?
Saturday, April 30, 2011
Saturday, April 9, 2011
The Troubles With Audio: Immersion vs. Quality
One of the most important factors to many of us is sound. Granted, I still find visuals integral in telling a story (which is the reason I'm an animation major), but I can't help but feel that audio can be the most important factor in creating something (at least when mixing the two together). There's no doubt that on their own, these are two different monsters. It's near impossible to visually express sound to a deaf person, like the rhythmic strums of a guitar; similarly, it's near impossible to describe the full visual brilliance of a painting through words alone to a blind man, like the use of blending complementary colors in an organic way. I could go on about both, but I want to focus on audio today. The reason is that I just got myself the remastered version of Rush's album "Moving Pictures." Why? It came with a Blu-ray (a high-definition disc meant to contain higher quality audio and visuals, depending on the usage).
I listened to the album and it sounded FANTASTIC. This is no doubt the best I've heard some of this music in quite some time. Generally I've only been able to hear the music on the radio (and because of the limited bandwidth, a lot of it is lost in the process), or go as far as playing their song "Limelight" in the game "Rock Band," which plays the song in Dolby Digital (a codec that delivers higher quality sound with true surround by separating the audio into discrete channels). It sounded great, but despite the cleanliness of the audio, because of the amount of compression there is still data lost. According to the information on the case, the Blu-ray is practically what one hears in the studio, due to the increased amount of information being played.
Before I continue, I should probably explain more about the concept of data. Basically, there are several factors one must know about: the sampling rate (the number of times the audio is sampled per second), the bit depth (how much data is present in each individual sample), and the number of channels being used (i.e., the number of speakers). Multiplied together, these give the eventual "bit rate," or how much data is played per second. The thought of this came to me as I listened to this new, higher-resolution mix of the music. The overall bit rate for this music (in surround sound) was 13.8 megabits per second (Mbps). Compared to a regular CD in stereo (1.411 Mbps), this is about 10 times the bit rate. The reason is that for the Blu-ray, the sampling rate was increased from 44.1 kHz to 96 kHz and the bit depth was increased from 16-bit to 24-bit encoding. For a CD, this works out to about 700 kbps per channel. For the Blu-ray, it averages out to about 2.76 Mbps per channel.
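To make the math above concrete, here's a quick sketch of the arithmetic (the helper function name is just for illustration): multiply the sampling rate by the bit depth by the channel count and you get the uncompressed PCM bit rate.

```python
def pcm_bit_rate(sample_rate_hz, bit_depth, channels):
    """Uncompressed PCM bit rate in bits per second:
    sampling rate x bit depth x number of channels."""
    return sample_rate_hz * bit_depth * channels

# Standard audio CD: 44.1 kHz, 16-bit, stereo
cd = pcm_bit_rate(44_100, 16, 2)
# The Blu-ray's surround mix: 96 kHz, 24-bit, 6 channels (5.1)
bluray = pcm_bit_rate(96_000, 24, 6)

print(f"CD:      {cd / 1_000_000:.3f} Mbps")   # 1.411 Mbps
print(f"Blu-ray: {bluray / 1_000_000:.3f} Mbps") # 13.824 Mbps
```

Run the numbers and you get exactly the figures above: 1.411 Mbps for the CD and roughly 13.8 Mbps for the Blu-ray.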
The reason I mention this is that I also own a music DVD: Cirque du Soleil's "Love," a remix of some of the music of The Beatles. The music was on a DVD and was in surround as well. The bit rate averaged about 1.5 Mbps (in DTS, or Digital Theater Systems). However, despite this seemingly high bit rate, it's nowhere near as good as a CD, which is considered "uncompressed." The reason? Despite a high sampling rate, the bit depth is lower and the audio is dispersed among more channels. In essence, even on the highest encode, the bit rate averaged out to about 300 kbps per channel (although in all accuracy, it would be closer to 400 kbps, due to the way it's processed). This made me realize that, for the longest while, we really weren't getting the full experience when we listened to music on a DVD. Blu-ray audio discs and Super Audio CDs are relatively new. This makes me wonder: is the overall quality really worth sacrificing for the "immersive" experience?
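The per-channel comparison works the same way: divide the total bit rate by the number of full-range channels. (This is only a rough sketch: DTS is a lossy codec that allocates bits dynamically, so treating the 5.1 mix as five equal full-range channels is an approximation.)

```python
def per_channel_kbps(total_mbps, full_range_channels):
    """Rough average kilobits per second available to each channel."""
    return total_mbps * 1000 / full_range_channels

dts_dvd = per_channel_kbps(1.5, 5)    # the DVD's DTS track across 5 channels
cd = per_channel_kbps(1.411, 2)       # a stereo CD for comparison

print(f"DTS DVD: {dts_dvd:.0f} kbps per channel")  # 300 kbps
print(f"CD:      {cd:.0f} kbps per channel")       # ~706 kbps
```

So even though 1.5 Mbps sounds bigger than 1.411 Mbps, each channel on the DVD gets less than half the data of a CD channel.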
Well, that's debatable. The idea of surround sound is to make the music seem more live, like a concert. It also helps separate the tracks, so one can hear individual instruments in individual channels. Plus, it IS rather difficult to discern the difference at that much of a bit rate gap. However, for some audio purists (and those with better hearing), quality is a deal breaker. Compression results in certain acoustics being "erased," which makes for a duller sound and a narrower range in both volume and frequencies. However, with people's hearing deteriorating, one has to wonder if quality is simply a secondary thought now.
Saturday, April 2, 2011
3DS: First big leap without glasses
As I've discussed before, 3D is a technology that has been bouncing in and out of popularity since the '50s. However, it wasn't until fairly recently that the technology began to evolve. What started off as dual-colored glasses (red/cyan) evolved into polarized lenses (the image is filtered to match the glasses) and active-shutter glasses (visibility of the image is alternated between the eyes at 60 frames per second per eye), both of which kept the original image intact. Now, with the 3DS, Nintendo's newest handheld, the viewer can see 3D images without glasses.
The Nintendo 3DS utilizes a technology still being worked on called autostereoscopy, which produces a 3D image that can be viewed passively, without any glasses. Two overlapping images are used to generate the effect. The reason each eye is able to discern its own image is something called a "parallax barrier," a layer over the screen that blocks the opposing image from each eye, allowing the two eyes to perceive the images differently.
This does two things compared to a traditionally projected 3D image. It not only widens the image, showing more (think of it as 16:9 widescreen vs. a pan-and-scan 4:3 screen ratio), but it also enhances the overall definition of the image, making it higher resolution and clearer. There is, however, a drawback to this technology. Due to the way the screen and barrier are aligned, the viewing angle is EXTREMELY limited. Unlike more traditional 3D, which lets you see the 3D image from almost any angle, one has to look at the image straight on to get the full 3D effect. Although one can shift vertically without affecting it, shifting horizontally completely ruins the effect (making the images uneven or visibly separated). The second issue is that it's also more difficult to view the image over long periods compared to watching 3D with glasses. While one can watch an entire movie with glasses, with autostereoscopic viewing one can begin to suffer eyestrain at around 30 minutes on average.
While autostereoscopic viewing seems considerably more organic, the eyestrain and narrower viewing angle make it a less enjoyable experience than using glasses, which are easier on the eyes in the long run. Granted, it is a technology still early in development. In terms of experimentation, it is fascinating. No doubt that as time progresses, the technology will evolve and improve, creating less strain and a wider viewing angle for viewers.