The Manic Nerd
The random rantings of a semi-insane, highly opinionated wannabe hacker
Monday, March 28, 2011
Yeah. In my defense, it was the cheapest option for what I wanted (mobile internet without getting a contract, basically by stealing other people's wifi), and literally the first thing I did after taking it home was jailbreak it. Crazy that Apple would be the cheapest (for the 8 GB model, at least), but there you have it. Strange, having to pay more for open source. Rest assured, when I get a smartphone it'll run Android. This is just a makeshift replacement for my laptop, which died (sorta) and would have cost more to fix.
Anyway, now I can play Final Fantasy VI (the best one, IMO) on a handheld device that also surfs the internet. Dream come true. And the iPod touch (really, all of the iOS devices) is a pretty impressive machine - it has a 1 GHz processor, a camera, a really posh touch screen, and an accelerometer, all in one very light, small package. Shame there's no physical keyboard (understandable) and no SD card slot (it borders on extortion to get the bigger models; no, Apple, adding 56 gigabytes to a machine should not double the price), and that it's locked down out of the box. Still, 8 GB is fine by me, especially if I actually get around to setting up an FTP server one of these days and streaming all my stuff to the iPod.
Anyway, very nice, and cheap compared to similar offerings. Other than the stuff listed above, it's a joy to use, and really that's what counts.
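Speaking of that FTP idea, here's roughly what I have in mind. This is just a sketch, assuming the pyftpdlib Python package (I haven't actually set any of this up yet, and the shared folder, username, and password below are all placeholders):

    # Minimal read-only FTP server sketch using pyftpdlib (an assumption on my
    # part; any FTP daemon would do). The iPod would then connect with an
    # FTP-capable app and stream files out of the shared folder.
    from pyftpdlib.authorizers import DummyAuthorizer
    from pyftpdlib.handlers import FTPHandler
    from pyftpdlib.servers import FTPServer

    authorizer = DummyAuthorizer()
    # "elr" = change directory, list, and retrieve files: read-only access.
    # The home directory and credentials here are made up.
    authorizer.add_user("ipod", "secret", homedir="/home/me/media", perm="elr")

    handler = FTPHandler
    handler.authorizer = authorizer

    # Port 2121 avoids needing root for the standard port 21.
    server = FTPServer(("0.0.0.0", 2121), handler)
    server.serve_forever()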
Wednesday, March 9, 2011
The War on Homebrew
I touched on this in my last post, but I figured it was worth a full one. Also note that I have no actual experience with homebrew on any console but the Wii, so everything else is just hearsay.
Anyways, one of the reasons an Xbox 360 can play, say, Fallout 3 while my computer cannot, even though my computer has somewhat better stats all-around (except perhaps the GPU, which is rather difficult to compare exactly), is optimization. Optimization is the idea that if you have a standardized set of hardware, you can squeeze far more power and stability out of it than out of a lot of different, often customized machines of roughly equal power. So you either end up with hardware that could in theory do a lot more than it ever does, because nobody writes programs specifically for that device, or with a lot of identical or nearly identical units running code written specifically for that machine.
Unfortunately (or rather fortunately, for other reasons which somewhat override this complaint), the PC market is a free-for-all, with many different vendors selling many constantly-changing computer models. Also, the PC gaming crowd tends to build their own machines or upgrade existing ones, which makes machine-specific software utterly impossible. Again, this is a good thing for many reasons, but optimization is not one of them.
Home consoles are the other end of the spectrum. Every single one of them is exactly the same, barring minor changes such as the recent model of Xbox 360 or a certain hidden change to the Wii that I'll get back to later. Unfortunately, they are also closed platforms - not only do you have to pay money for the privilege of developing software for them (which must then be approved by the vendor) but from what I hear you also have to know the right people.
Thus, homebrew: unauthorized software made by home users in order to capitalize on optimization. Security vulnerabilities are discovered in the consoles' operating systems and then pried open by software such as the Homebrew Channel, much as a small crack in an asphalt road widens over time. Predictably, the legal status of all this is somewhat hazy.
Vendors do not like this. Not at all. After all, they paid their programmers good money to lock down what they believe to be their systems. Naturally, the problem must be eradicated. This, not performance or stability, is what drives console makers to push updates. First an update killed the Twilight Hack, then the original Bannerbomb. Not once along the way has Nintendo added a useful feature or improved either the stability or the performance of the Wii, and later games run fine on un-updated Wiis, provided they can be made to launch without installing the update they ship with.
Microsoft is just as bad, locking users out of Xbox Live for the crime of using their own hardware. And Sony... if I get going on Sony, you'll be here all day. Long story short, console vendors use updates to trick users into locking their own systems down further under the guise of improving them. And it's not just software updates. You think the only reason Microsoft redesigned the Xbox 360 was to make it look cooler? A similar thing happened to the Wii, but without all the fanfare: Wiis made after a certain date have a crucial part of the NAND, the part involved in booting the console, made non-rewritable. That was because hackers had figured out how to overwrite it with software that could repair the Wii should the rest of the NAND somehow get corrupted. In other words, Nintendo intentionally made the Wii less stable, not more. (Did I mention that Nintendo's own updates can sometimes be the source of said corruption?)
Yes, the ability to run unauthorized code can and does sometimes lead to piracy. And that sucks. That said, the major Wii hacking sites ban any software that could be used to pirate official Wii games. The idea isn't to pirate; it's to take advantage of the wonderful hardware the user paid for with their own money.
Sunday, February 27, 2011
It's been a while, but this is pretty darn cool.
Yeah, I haven't posted in a long time. No excuse, I've just been lazy. Also, I hadn't really come across anything interesting enough to post about. No longer, because the great illuminator that is StumbleUpon has shown me something of highly questionable legality but amazing potential.
I'm talking about Playr: http://www.playr.org/
Playr uses Flash-based emulators to play old games (mostly Game Boy and Game Boy Color). It works decently well under Linux, which means it probably runs better under Windows, seeing as how Flash on Linux is so awful. I haven't tested it under Windows yet. What's really intriguing is that it lets you log in and keep save states of the games you play. In other words, it's cloud-based emulation, and it doesn't matter what your OS is. You could even play it in the Wii's browser (of course, it wouldn't work well, since the Wii has even worse Flash than Linux).
This is, of course, highly suspect under certain IP laws, at least in the United States (Playr is hosted in Sweden, according to WolframAlpha). Actually, I'm almost afraid to call attention to it, or at least I would be if anyone read this blog. Nintendo in particular is absolutely vicious about its copyrights (and about DRM; Nintendo basically declared war on Wii homebrew, and apparently the 3DS will download updates automatically, without the user's permission, any time it's in wifi range) and will almost certainly sue if Playr becomes well-known enough. Even if Nintendo can't win the case, the legal fees alone could drive the site under.
Still, I can't help but hope that this site will flourish. And if it does, we'll have quite a service to use in the future.
Sunday, November 21, 2010
Well, Sony is still a bunch of jerks.
Remember the rootkit thing? This isn't as bad, but it shows that Sony hasn't changed one bit since.
I was surfing around on YouTube when I decided to look at some of JibJab's videos. For those who don't know, JibJab is a company that makes e-cards and such, and every year (and every presidential election) they do current-events parodies. This is some of their best content, and it's why I like them so much (and no, that's not a paid solicitation, it's my honest opinion). Anyways, a while back "Weird Al" Yankovic teamed up with JibJab to run a viral marketing campaign for his latest album, entitled "Internet Leaks". Get it? Some of the songs on the album were intentionally put on the internet in order to drum up sales for the album, which, fittingly, was digital-only.
Cut to a year later, and I find this:
Sony, it seems, in its infinite wisdom, has decided to block "CNR", a music video parodying the late Charles Nelson Reilly by turning him into a Chuck Norris-like figure. A year late.
There is simply no way Weird Al could have put "CNR" on the internet without Sony's permission. Not without having it taken down immediately, at least. They had to have agreed to it at some point. Apparently they changed their minds. Sony still has no integrity. Hardly a surprise, but pardon me for hoping.
The really bad part is the irony. The horrible, horrible irony. It's an album which - as part of the joke - was intentionally leaked to the internet. But even when it has agreed to them, Sony can't abide leaks. It's almost as ironic as that time a certain ebook provider deleted "1984" (yes, that "1984") right off of everyone's ebook readers.
It's stuff like this that makes me want to throw my Blu-ray player off the Empire State Building.
Friday, October 29, 2010
Linux Mint: Linux even more for the rest of us.
Well, at the behest of a certain critic I decided to install Linux Mint on my aging, but still decent, desktop (2.2 GHz dual-core AMD processor, 3 GB of RAM), which had Ubuntu on it previously. So this thing (I won't call it a review; it's more of a first-impressions thingy) will focus on comparing Mint to Ubuntu, which makes sense, because Mint is derived from Ubuntu.
The very first impression I had was that Mint was remarkably minimalistic; it didn't even have a command line! But eventually I realized that what I was looking at was my monitor flashing "analog, digital, analog, digital" in the corner of the screen, not the operating system. The video driver had malfunctioned somehow, and even now I have no idea what the problem was, other than my thrice-damned Nvidia chipset.
Now, a malfunctioning video driver is a much bigger problem than pretty much anything else, since, you know, you have no screen. But eventually I did find a fix, which involved altering the boot options when starting from the live CD, then installing Mint, then altering the installed system so it would apply those same options on every boot. The fix I found was for an older version of Mint (I think) and had to be adapted somewhat to fit. It took me something like three hours, but I got it working and successfully installed the proprietary Nvidia drivers on my otherwise open-source machine.
All right, that was something of an unfair criticism. The rest of Mint makes up for it, I promise. Also, I blame Nvidia! Anyways, my real first impression was that Mint (under GNOME, at least) looked remarkably like Windows, even more so than Ubuntu. The "Menu" button was in the same place as the Start button, the window buttons were on the right (as opposed to Ubuntu 10.04 LTS, where they're on the left by default), and overall the thing seemed very Windows-like, which is great, because the Windows interface is pretty much the only thing, other than compatibility, that Windows has going for it these days.
In contrast, Ubuntu (I don't know about the newest one, I stuck with the LTS) decided to appeal to the Mac crowd, putting the buttons on the left and sticking a lot of primary operating system stuff in the upper-left corner, again like a Mac. This doesn't seem like a great decision by Canonical. Mac users are generally satisfied with their computers and in any case are beholden to Apple. Windows users, on the other hand, are a very fearful bunch, and rightfully so; their OS is subject to pretty much every virus on the internet, both because of its gigantic market share and because of its flawed permission system (Microsoft tried to fix it with UAC; it didn't work). So, logically, you would want to imitate Windows, because most of your converts will be coming from it, and you want to ease their course through Linux, which at times can be a very scary place.
Anyways, back to Mint. One of the great things about Mint is that it includes a bunch of stuff everyone adds to an Ubuntu install anyway but that, for legal reasons, cannot be shipped with Ubuntu. Flash, MP3 support, it's all there. How they did this without being sued, I have no idea.
So, that's my first impression of Mint 9, which hopefully I will be using for a long time to come (I am politely refusing to install Mint 10 on my computer). It's quite nice, and I would think even optimal for scared Windows users who want sanctuary from all the scary things that go run DDoS attacks in the night.
Monday, October 11, 2010
File Extensions - everything Microsoft loves to hide, or, how viruses work.
Now, to be fair to Windows, I love it. I think it's a great, if highly flawed, operating system. I love Linux too, but it's a different kind of love. My biggest gripe with Windows, however, isn't that it's closed source or that it allegedly contains some sort of highly controlling DRM (as near as I can tell, the only DRM in Windows is the stuff meant to prevent, usually unsuccessfully, piracy of Windows itself, which I suppose is fair enough). It's the way it treats file extensions.
Now, for those of you who have somehow stumbled upon a tech blog without knowing what a file extension is: it's at the end of the filename, and it tells the computer what it should do with a file when you click it. If the file extension is, say, .jpg, the file opens in Windows Picture Viewer or whatever else you have set as your default for .jpg files. A .jpg file is a picture file, thus it opens with picture programs. Likewise, if it's a .doc or .txt file, it opens in a text program, like Microsoft Word or OpenOffice.
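To make that a bit more concrete, here's a toy sketch in Python of the kind of lookup going on under the hood. (Windows actually keeps its file associations in the registry; this is just the idea, with a made-up table.)

    # Toy illustration of extension-based file handling. Windows stores the
    # real associations in the registry; this just shows the idea.
    import os

    HANDLERS = {
        ".jpg": "open in a picture viewer",
        ".txt": "open in a text editor",
        ".doc": "open in a word processor",
        ".exe": "run it as a program",   # the dangerous one, see below
    }

    def handler_for(filename):
        _, ext = os.path.splitext(filename)   # "notes.txt" -> ".txt"
        return HANDLERS.get(ext.lower(), "ask the user what to do")

    print(handler_for("vacation.JPG"))   # open in a picture viewer
    print(handler_for("report.doc"))     # open in a word processor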
The reason you, Mr. Joe Enduser, have never heard of this is that Windows, by default, hides file extensions, as if Microsoft actually wants its customers to have no clue how their computers work (whether that's the real motive is another matter). Whoever made this decision should be tarred and feathered, because he is responsible for millions if not billions of dollars of losses caused by people not knowing what a file extension is.
But how, exactly, does this default setting waste so much money? Simple. Viruses. See, Microsoft's excuse for this whole debacle is that all files are displayed next to an icon, a small picture indicating loosely what sort of file it is. A text file will have a piece of paper, a picture will have a picture frame, etc. This is all well and good, but the problem is that the icon can be changed. The icon can lie. Usually, what happens is, you get ahold of a file somehow, and the icon looks just like the icon for a picture file, so you click the file to see what it is.
If you run XP, nothing seems to happen. If you run Vista or 7, you get a warning screen, which you click through like a zombie, since, after all, all you're doing is looking at a picture. Then nothing seems to happen. In both cases something does happen; you just can't see it.
Here's what happens. That picture you clicked wasn't a picture at all. It was, in fact, a .exe file. An executable file. .exe files are unique in that instead of opening in a program, they are a program, just like the web browser you are viewing this in. Only this program is a virus. It implants itself in your computer, creating another .exe, hiding it somewhere, and setting it to run at startup, lurking in the background. And then it wreaks havoc, with symptoms depending on the virus.
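Once extensions are visible, the trick is easy to spot, and you can even automate the spotting. Here's a little Python sketch that flags files pretending to be something they're not (the folder path is just an example):

    # Flag files that end in a harmless-looking extension followed by a real
    # executable one, e.g. "vacation.jpg.exe". The folder path is an example.
    import os

    EXECUTABLE = {".exe", ".scr", ".com", ".bat", ".pif"}
    INNOCENT = {".jpg", ".png", ".gif", ".txt", ".doc", ".pdf"}

    def looks_disguised(filename):
        base, last = os.path.splitext(filename.lower())   # -> ("photo.jpg", ".exe")
        _, second = os.path.splitext(base)                # -> ".jpg"
        return last in EXECUTABLE and second in INNOCENT

    for name in os.listdir(os.path.expanduser("~/Downloads")):
        if looks_disguised(name):
            print("Suspicious:", name)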
Obviously this isn't the only way viruses spread, but it's probably one of the biggest, and it is literally all Microsoft's fault. So change your settings to show extensions!
Sunday, October 10, 2010
Phones
Today is an interesting time to have a cell phone, and since just about everyone has a phone nowadays, that makes the world quite a bit more interesting. See, apparently one day someone said, "Hey, if our customers are all carrying around small computers in their pockets, why not have those computers do more than just take calls?" A few months of engineering later and the first modern cell phone was invented. So then we had cell phones that could play Slots and Crab Catch on their monochrome screens. Add a few more accessories like a camera, make SMS ubiquitous, and there you go.
Then someone else said, "Why stop at stupid games? Let's make a phone that can access email and surf the internet! And let's add a QWERTY keyboard!" And thus the smartphone was born. (This is all extremely oversimplified, but I like it that way.)
Anyway, this is my phone:
Yes, the picture was taken with a webcam and edited in Paint. Get over it. It's a Nokia 2330. As you can see, this isn't exactly the Porsche of phones. In fact, it's pretty much the worst phone still sold in first-world nations. But that's fine with me. It works. It has ten megabytes of memory, seven of which are full out of the box, but it works. Anyways, messing with this phone netted me a few interesting facts.
For starters, most phones share a common type of executable file: the .jar file, which is apparently Java-based. I was expecting to have to look up stuff for my specific phone, but I was able to get my program (a Tetris clone; since most phones come with a version of Tetris that lasts about thirty seconds before telling you to buy it, I was dying to beat the system, and I found a version sold for the low, low cost of nothing) onto my phone over Bluetooth (another thing which, like phones, I had zero experience with, but I'm very glad my laptop came with it) and run it without a hitch. Well, on the third try. My phone has such a pathetic processor that only the most minimalistic apps will run on it.
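Incidentally, a .jar file turns out to be nothing more than a zip archive with a manifest inside, so you can peek at what a game declares about itself before sending it over. A quick Python sketch of what I mean (the filename is a placeholder, and the MIDlet-* entries are what Java ME apps typically list, as far as I can tell):

    # A .jar is just a zip archive; Java ME ("J2ME") apps describe themselves
    # in META-INF/MANIFEST.MF. The filename below is a placeholder.
    import zipfile

    with zipfile.ZipFile("tetris_clone.jar") as jar:
        manifest = jar.read("META-INF/MANIFEST.MF").decode("utf-8", "replace")

    for line in manifest.splitlines():
        # e.g. MIDlet-Name, MIDlet-Vendor, MicroEdition-Profile
        if line.startswith(("MIDlet-", "MicroEdition-")):
            print(line)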
Anyways, this was a pleasant surprise, and an example of how a little tech skill, some creativity, and most of all dogged thriftiness can save a lot of people a lot of money.