Comment Re:VR is undeniably the future. (Score 1) 115

Yep. It's still pretty rough around the edges, but it's absolutely here. There are VR ports of Doom-based games (Doom, Doom 2, Hexen, etc.), Quake-based games, and Duke Nukem 3D, and some new, fairly serious VR titles are starting to show up. Those old games were *hoping* for VR to take off 20 years ago, and they are awesome now that it's available. Games *actually* designed for current VR are starting to gel. Orbus VR (a VR MMO) is amazing. VR with head tracking is awesome, but once you add touch controllers, it's a whole different level; when you have hands in VR, it's substantially more immersive.

You need a pretty burly video card for VR, and for the Oculus CV1 specifically one that can drive 2160x1200 (1080x1200 per eye) at 90 Hz over HDMI 1.3, which means you need a fairly recent card, ideally a high-end one. The dev kits will work on older hardware, though... I just picked up several more DK2 dev kits cheap on eBay so I can host a VR LAN party. They don't support touch, but for old shooters they will rock it.

VR is drafting off the fact that cell phone screens (the main component) are now a mass-produced commodity, with phone vendors doing all the R&D to shrink the panels and increase the resolution. The price is going to drop as it becomes more mainstream. There were nifty VR headsets 20 years ago, but the display tech sucked and was too expensive... now it's good enough, *cheap enough*, and comfortable enough to work as a product. It's still going to iterate fast, though... current headsets are going to be superseded quickly as VR takes off.

Comment Rift is fantastic. (Score 3, Insightful) 413

There aren't a lot of apps yet, but the Oculus Rift really is fantastic. It runs OK on relatively old gaming hardware... I have both the DK2 and the consumer release version with touch controllers. It certainly runs better on an Nvidia GTX 1070 (or better), but it ran just fine on my previous card, a GTX 480, which is technically below minimum requirements. It also runs pretty well on a GTX 980M (a mobile GPU for laptops). Everyone I've demonstrated my rigs to wants one (and my gaming rig is a four-year-old i7 box, aside from the recently upgraded 1070). Given that the Rift is mostly a cell phone screen, some electronics, and a couple of cameras, $400 is what the price point should have been at launch. You need a beefy system, but it definitely doesn't have to be a crazy rig.

Having tried the HTC Vive, and owning some older VR equipment (Vuzix stuff), it's pretty clear to me that the Rift is the best VR equipment I've used. It's more responsive, and looks better (higher res screens). Other stuff I've tried tends to be kind of laggy with respect to head tracking and motion controllers, but not the Rift.

Elite Dangerous in VR is a fantastic experience, as is Medium (3D sculpting). I've also run a fair number of older titles through VireioX with good success; Skyrim is pretty cool in VR. Even my 80-year-old parents like many of the experiences. Viewing 360-degree photos of sites in Egypt in VR is amazing.

VR games are _different_ from existing games, though... the VR experience is more intense and more immersive, and some things that worked on 2D screens don't work well in VR. For example, the speed you run at in most FPS games works out to 50-70 MPH, which in VR makes you want to hurl; running that fast through corridors is disorienting. I also tend to be sated with VR after a much shorter gaming session... the experience is just more intense. I believe VR is here to stay, but it will take a while for developers to figure out what works well in VR game design.

Comment Stratus has proprietary redundant *everything*. (Score 5, Informative) 137

Stratus has proprietary redundant *everything* in their machines, running in lockstep; they literally have two of everything in there... two motherboards, two CPUs, two sets of RAM, etc. If anything weird happens on one side, the machine fails over to the other motherboard, which has been running in lockstep on the other blade in the chassis. Combine that with an extremely conservative set of drivers that are known stable, and you can get six nines out of the thing. Stratus boxes are typically used for credit card processing and banking applications where it's never acceptable to have a machine down even for the time it takes to reboot. Really, really, really expensive, though. You wouldn't want to use one of these for anything normal.
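
To make the lockstep-and-failover idea concrete, here's a minimal toy sketch in Python. It is purely conceptual: real Stratus hardware compares duplicated boards in hardware, cycle by cycle, and the Replica class and injected fault below are invented for illustration only.

```python
# Toy illustration of lockstep redundancy with failover. Purely conceptual:
# real Stratus systems duplicate the hardware and compare the boards'
# outputs in lockstep; the Replica class and injected fault are invented
# here for illustration only.

class Replica:
    """One of two identical compute units."""
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def step(self, x):
        # A healthy replica computes the intended result; an unhealthy one
        # returns a corrupted value to simulate a hardware fault.
        result = x * 2 + 1
        return result if self.healthy else result ^ 0x40

def lockstep_step(primary, secondary, x):
    """Run both replicas on the same input, compare, fail over on mismatch."""
    a, b = primary.step(x), secondary.step(x)
    if a == b:
        return a, primary
    # Outputs diverged. In this toy we simply trust the secondary and keep
    # going; real systems have extra checking to decide which side failed.
    print(f"mismatch ({a} != {b}): failing over to {secondary.name}")
    return b, secondary

if __name__ == "__main__":
    primary, secondary = Replica("board-A"), Replica("board-B")
    for i in range(5):
        if i == 3:
            primary.healthy = False          # inject a fault mid-run
        value, survivor = lockstep_step(primary, secondary, i)
        if survivor is not primary:
            # Swap in the good board and keep running without losing work.
            primary, secondary = secondary, Replica("board-A-replacement")
        print(f"step {i}: computed {value}")
```

The point is simply that because both sides compute every step, a divergence can be detected and survived without losing in-flight work.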

Comment It's the content. (Score 1) 274

It's the content holding back the revolution. That, and printers are still pretty crappy and still improving quickly. Making original models from scratch has a steep learning curve... it takes real skill to use 3D modelling software. Handheld 3D scanners aren't cheap enough or good enough to make a 3D printer worthwhile for the average user who just wants an appliance (and even with a good scanner, you have to clean up the model and modify it for printing). So while 3D printers are fantastic if you have the skills to use them, it's not easier or cheaper for an ordinary person to use one versus just buying a part produced by someone else. And that's before getting into the limits of the materials your printer can handle versus the quality that injection molding or other industrial processes put out at lower cost.

I personally use my printer to make wargaming terrain (toys), but a) I have the skills, and b) wargaming stuff is expensive enough that it's often (but not always) cheaper or easier to print, and I can customize the prints. It's kind of like saying CNC router tables and laser cutters are the bomb for woodworking. Yep, they're great, but at the end of the day each is just another tool with a relatively specialized use in crafting.

When we have a process that can scan an object and print a copy at 1-micron resolution, at the push of a button, in under an hour, then it'll really start taking off. Until that point, it's not really consumer ready.

Comment Bullshit! (Score 1) 199

Bullshit.

Net neutrality is supposed to protect users from networks throttling the data users want in favor of the data the networks want (e.g. ads getting higher priority than real-time gaming or web traffic, because the network gets paid more for the ad traffic). A user preference system does not solve that problem, because the user can only specify preferences for his own little segment of the network, namely the uplink between him and the provider. The network providers can still throttle traffic all they want, and the user's preferences don't mean diddly in the big picture. Even if the user's preferences were somehow considered in routing across the wider internet, basically everybody would mark all their packets as high priority, so it's the same as having no priorities at all. Unless high-priority packets cost more... which is exactly what net neutrality is trying to prevent.

Suppose you use BitTorrent. If you set preferences to prioritize BitTorrent traffic on your uplink, but the networks throttle BitTorrent on the trunk lines and backbone, guess who wins? All other traffic. Your preference settings are irrelevant in the big picture, because performance is dominated by the load and traffic shaping on the other networks your packets travel through.

Sure, you can prioritize certain traffic over other traffic on your internal network and outgoing on your uplink. Oh wait, you already can, in almost all consumer-grade routers... look in the options and they let you set QoS and prioritize certain types of traffic. But none of them can control what happens on the uplink side or out in the network past your provider.
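
Here's a minimal toy simulation in Python of that "local priority loses to upstream throttling" point. The numbers and the two traffic classes are made up purely for illustration; the only real claim is that end-to-end throughput is bounded by the most restrictive hop.

```python
# Toy model of local QoS priority vs. upstream throttling.
# Every number here is made up purely for illustration.

def delivered(local_share_mbps: float, upstream_cap_mbps: float) -> float:
    """End-to-end rate is bounded by the most restrictive hop."""
    return min(local_share_mbps, upstream_cap_mbps)

UPLINK_MBPS = 100.0        # your uplink to the ISP
UPSTREAM_BT_CAP = 1.0      # hypothetical backbone throttle on BitTorrent

# Case 1: your router gives BitTorrent top priority (90% of the uplink).
# Case 2: your router shares the uplink evenly.
for label, bt_local in (("local priority", 0.9 * UPLINK_MBPS),
                        ("fair sharing  ", 0.5 * UPLINK_MBPS)):
    print(f"BitTorrent with {label}: "
          f"{delivered(bt_local, UPSTREAM_BT_CAP):.1f} Mbps end to end")
```

Either way, BitTorrent ends up pinned at the upstream cap; the only knob that matters is the one on the provider's side, which is exactly the knob net neutrality rules govern.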

Comment Too late. (Score 2, Informative) 271

Cryptocurrency is kind of like a Ponzi scheme.

I thought about buying dedicated Bitcoin mining hardware a couple of years ago. By the time you factor in the cost of electricity, the probability of actually mining a valid block, and the mechanism by which Bitcoin ratchets up the mining difficulty as new blocks are found, it's a losing proposition: it takes an ever-increasing amount of compute power to earn each new coin.

Unless you are mining via a botnet, stealing the compute and electricity the mining consumes, it's less than break-even on average, at least from a pure production standpoint.
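
For anyone who wants to run the numbers themselves, the back-of-the-envelope math is simple: a miner's expected blocks per second is roughly hashrate / (difficulty x 2^32), so expected revenue falls as network difficulty rises while the power bill stays put. A sketch, with every figure below a made-up placeholder rather than current market data:

```python
# Back-of-the-envelope Bitcoin mining profitability. Illustrative only:
# the hashrate, difficulty, reward, price, and power figures below are
# made-up placeholders, not current market data.

HASHES_PER_DIFFICULTY_1 = 2**32   # expected hashes per block at difficulty 1

def daily_profit_usd(hashrate_hs, difficulty, block_reward_btc,
                     btc_price_usd, power_watts, usd_per_kwh):
    """Expected USD profit per day for a single miner."""
    expected_blocks_per_sec = hashrate_hs / (difficulty * HASHES_PER_DIFFICULTY_1)
    revenue = expected_blocks_per_sec * 86_400 * block_reward_btc * btc_price_usd
    electricity = (power_watts / 1000.0) * 24 * usd_per_kwh
    return revenue - electricity

# Hypothetical ASIC: 14 TH/s drawing 1400 W at $0.12/kWh, 12.5 BTC reward.
for mult in (1, 4, 16, 64):       # watch profit evaporate as difficulty climbs
    profit = daily_profit_usd(hashrate_hs=14e12, difficulty=1e12 * mult,
                              block_reward_btc=12.5, btc_price_usd=10_000,
                              power_watts=1400, usd_per_kwh=0.12)
    print(f"difficulty x{mult:>2}: {profit:+7.2f} USD/day")
```

The loop at the end is the whole story: each time the network's difficulty quadruples, revenue drops to a quarter while the electricity bill doesn't move, so a rig that looks profitable on paper can be underwater before it even ships.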

Comment Re:I have a serious question.... (Score 2) 66

It's built into the design of Bitcoin. The network automatically adjusts the mining difficulty (in practice, almost always upward) based on how fast blocks were found in the previous retarget period. Every time a faster mining ASIC comes out, the difficulty shoots up correspondingly. At best you can mine a good percentage of the blocks in the current period with a new machine before the difficulty shoots back up and the new ASIC's performance becomes the new baseline. The new ASICs are still better than what the poor schmoes running regular CPUs or GPUs have, but it's quite difficult to "get ahead" with mining unless your mining resources are free, such as harvesting CPU cycles from a botnet. There is a finite supply of bitcoins, and every time the network's total hash power jumps, the difficulty of mining the next batch jumps right along with it.
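
Concretely, Bitcoin retargets every 2016 blocks: the difficulty is scaled by how much faster (or slower) than the ten-minute-per-block target the last batch was found, with each adjustment clamped to a factor of four. A simplified sketch:

```python
# Sketch of Bitcoin's difficulty retargeting rule, simplified: it works in
# difficulty units rather than target bits and ignores the real client's
# off-by-one in counting block intervals.

TARGET_SECONDS_PER_BLOCK = 600      # ten minutes
BLOCKS_PER_RETARGET = 2016          # roughly two weeks at the target rate

def retarget(old_difficulty: float, actual_seconds: float) -> float:
    """New difficulty after a 2016-block period that took actual_seconds."""
    expected = BLOCKS_PER_RETARGET * TARGET_SECONDS_PER_BLOCK
    ratio = expected / actual_seconds
    # The protocol clamps each adjustment to at most a factor of 4 either way.
    ratio = max(0.25, min(4.0, ratio))
    return old_difficulty * ratio

# If new ASICs let the network find 2016 blocks in 7 days instead of ~14,
# difficulty doubles and everyone's expected payout is roughly halved.
print(retarget(1_000_000, actual_seconds=7 * 24 * 3600))   # -> 2000000.0
```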

I recently thought about getting some ASIC mining hardware, but after running the numbers, factoring in the cost of electricity and the current price of Bitcoin, it was dubious whether the devices would even make back their purchase price, and that's assuming they shipped on time and performed as advertised. Even just running mining software on the CPUs and GPUs I already own is a losing proposition because of the electricity cost. Not a good investment, unless you're on unmetered power, such as in a college dorm.

Given the way the mining difficulty ramps up as blocks are found, the developers of any new, fast ASIC will reap the greatest benefit from the faster hardware during development, and by the time you get it in your hands the Bitcoin network will have already cranked up the difficulty. It will still be faster than older hardware, but since the difficulty went up too, it's likely to produce much less than you initially expected.

Comment CS is not programming (Score 2) 546

Computer Science is largely very specific applied math and theory. It includes algorithms, algorithm efficiency, a bunch of math, data structures from a theoretical design standpoint, and computer architecture. It tends to be very academic.

University programs vary widely on what the programs focus on, but generally Comp Sci is about the math and theory, and programming is something you do on the side to get the assignments done to illustrate the theory you are learning. With Computer Engineering and Software Engineering programs, things tend to be more hands on and focused more on doing than theory.

Programming, as desired by business, is NOT computer science. Business wants the most simplistic designs (e.g. always use a linked list instead of a more appropriate data structure), and above all, they want you to code whatever it is FAST FAST FAST so you can SHIP SHIP SHIP. Generally, most businesses are not software businesses, and they don't value developers or software beyond getting the minimum quick-and-dirty solution out of them as fast and as cheap as possible. Also, most businesses are not doing anything remotely resembling the state of the art, and they value the ability to hire a newbie to replace you.
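
As a trivial, purely illustrative example of the kind of data-structure choice that gets waved away: a membership test against a Python list is O(n) per lookup, while a set is roughly O(1), and the gap is easy to measure.

```python
# Quick, informal timing of list vs. set membership tests.
# Absolute numbers will vary by machine; only the ratio matters here.
import timeit

N = 100_000
as_list = list(range(N))
as_set = set(as_list)

# Worst case for the list: the element we look for is at the very end.
list_time = timeit.timeit(lambda: (N - 1) in as_list, number=1_000)
set_time = timeit.timeit(lambda: (N - 1) in as_set, number=1_000)

print(f"list membership: {list_time:.4f}s   set membership: {set_time:.6f}s")
```

The point isn't the exact numbers; it's that "whatever ships fastest" and "whatever runs fastest" are different questions, and business usually only asks the first one.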

CS grads have it rough. They know too much theory to be satisfied with a basic programmer job, but they don't know enough about efficiently cranking out code day in and day out to have an easy time in one. The degree can get you in the door, though; a lot of places filter out folks with no degree.

Not that there aren't some grads who still can't code their way out of a wet paper bag.

There's all sorts of stuff about programming that you will never learn in a CS program, such as when to select a design based on implementation risk and ease of maintenance rather than algorithmic efficiency. It sucks, but the people who pay you to write the software could not give two shits about how well the code is designed, as long as it mostly works and ships on time. Most of that CS theory only becomes directly relevant later in your programming career, when you actually have some autonomy to "do it right" versus "do it yesterday", or if you strike out on your own.

Comment Re:In the US, business doesn't care. (Score 1) 201

If they require a master's or PhD, it's not an entry level position.

They either a) are trying to change the world with new or hard stuff and want a theory guy to guide things, b) don't know what they are doing, or c) don't want to mess around with kids straight out of school who haven't figured out the corporate metagame and "git 'er done" culture yet.

There's the optimal implementation on paper, given infinite time for implementation, and there's the "We have two weeks, do what you can pull off" implementation that business is usually looking for. Business values programmer time more than academia does. I know my CS degree didn't prep me for that very well.

Actual engineering is a bit less Wild West than software... there are legal definitions of what a certified engineer is responsible for; if people die as a result of your engineering mistakes, it's your fault, not just some edge-case bug. But the same corporate BS still drives it, so the same stuff applies... HR is still about risk avoidance; it's just that a guy with a master's or PhD had to jump through more hoops to get to the table, so the wheat has been separated from the chaff, so to speak.

Business doesn't care about getting the best candidate; they care about getting the guy who looks good enough for the money they're willing to spend and won't end up as a disaster. Also, some of those job postings may require a master's or PhD so the company can legally justify hiring an H-1B after no one "qualified" can be found.

Comment In the US, business doesn't care. (Score 3) 201

Business (HR specifically) doesn't give a shit about your degree. They care about a) that you have the checkbox, b) who you worked for previously and are not lying about it, and c) whether it looks like you aren't a total fuckup who will cost them. It's about risk avoidance.

The actual team you interview with (if it wasn't an HR drone) cares that you look like you know your shit and can carry your weight.

Engineering and especially computing degrees are such a crapshoot in terms of the skills a candidate actually has that employers don't know how to weigh your degree. Even degrees from badass schools sometimes come attached to folks who still can't code their way out of a wet paper bag. Besides, most of the senior-level theory in the degree won't help you much in a real-world job until the later stages of your career, and it will piss off peers who don't have the same background, and definitely management, who barely understand what a linked list is.

The quality of in person versus remote will depend on your learning style, and whether you actually would make use of those in-person office hours anyway.

Comment Most people don't understand that it's a bad idea. (Score 5, Informative) 405

Look, digital electronics are still subject to analog limitations. When you overclock, you squeeze the hysteresis curve, increasing the probability that your chip incorrectly interprets the state of a particular bit as the opposite value. In other words, you get random data corruption. This is why you eventually start crashing randomly the more you overclock.
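
To put a rough number on it, here's a toy calculation (not a model of any real chip; the per-bit error rates and the bus width below are invented): even a tiny per-bit error probability compounds quickly across the billions of bits a system moves every second.

```python
# Toy calculation of how quickly small per-bit error probabilities add up.
# The per-bit rates and the bus width/clock below are made-up illustrations,
# not measured values for any real chip.
import math

def p_any_error(per_bit_error: float, bits: float) -> float:
    """P(at least one flipped bit) = 1 - (1 - p)^n, computed stably."""
    return -math.expm1(bits * math.log1p(-per_bit_error))

BITS_PER_SECOND = 64 * 3e9   # a 64-bit path toggling at ~3 GHz, for illustration

for p in (1e-18, 1e-15, 1e-12):   # stable margins vs. increasingly squeezed ones
    print(f"per-bit error {p:.0e}: "
          f"P(at least one error per hour) = {p_any_error(p, BITS_PER_SECOND * 3600):.3f}")
```

A per-bit error rate you can't even measure at stock clocks becomes a near-certain hourly corruption once the margins get squeezed hard enough.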

Overclocking a chip that was conservatively binned simply to reduce manufacturing costs, but is actually stable at higher clock rates, is reasonable; trying to overclock past the design limits is pretty insane if you care at all about data integrity. You also tend to burn out the electronics before their expected lifetime due to the added heat stress.

I never overclock.

Comment Outside virtualization, it won't work well. (Score 1) 239

Most devices barely work under one operating system, let alone being initialized and controlled by multiple driver models and hot-switched back and forth between them.

They are simply not designed for that scenario. Hence, the hypervisor, and virtualized devices under it.

Comment $4 for 5 cent parts isn't going to motivate DIYers (Score 1) 413

Radio Shack has been a ripoff for years. Why the hell would anyone who knows enough to DIY pay $4 for a 5-cent part? Sure, it might take a few days for parts to arrive from Mouser, but honestly, when you're designing a circuit you need a lot of components, you generally plan out what you need in detail, and a retail store isn't going to stock whatever exotic parts your project requires anyway.

Since there are far more folks who aren't with it enough to DIY, Radio Shack is far better off overcharging the masses for extension cords, sub-par computers, and low-grade RC cars at the mall. They just want the masses to THINK that smart people shop there.

Comment I went to WWU, and have a CS degree... (Score 3, Informative) 298

Western's CS program is one of those that grew out of a math base. It's pretty hardcore on the theory, but you're largely on your own for learning the stuff that business wants. Which is fine... even if the program focused on exactly the buzzwords corporations want these days, corporations don't generally hire CS grads straight out of school anyway. The stuff you learn in the 400-level classes is great for senior developers to know... but you're not going to start out as one. It wasn't until my third job out of college (which I'm still at) that I actually got to touch source code at work. For long-term personal growth, I'm really glad I had my ass kicked with the theory; the rigorous methods that were drilled into me really help me tackle the hard problems I work on every day (debugging nasty kernel-mode race conditions in other people's code, for example). Besides, if you can handle the proofs and the algorithms work, you can handle anything else, though you sure as hell won't enjoy writing silly business apps over and over.
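
Since race conditions came up: here is a deliberately exaggerated toy example in Python of the read-modify-write race at the heart of most of them. The sleep only widens the window; real races in kernel code are the same problem at nanosecond scale.

```python
# A deliberately exaggerated race condition: two threads do an unprotected
# read-modify-write on a shared counter. The sleep(0) just yields so the
# other thread can sneak in; real kernel races are the same bug, only faster.
import threading
import time

counter = 0

def unsafe_increment(times: int) -> None:
    global counter
    for _ in range(times):
        current = counter        # read
        time.sleep(0)            # yield: the other thread can run right here
        counter = current + 1    # write back a possibly stale value

threads = [threading.Thread(target=unsafe_increment, args=(10_000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"expected 20000, got {counter}")   # usually far less: updates were lost
```

Wrapping the read-modify-write in a lock makes the count come out right every time; the hard part in real systems is noticing that the unprotected version is wrong at all.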

You know what the job-placement folks at Western tell you about finding a job once you graduate? They tell you to forget about finding anything remotely in your field. The real difficulty in getting hired after college has less to do with your skills or what you were taught and more to do with employers' risk aversion... they don't like hiring green kids who don't understand corporate politics yet. You have to persevere in order to get to do what you love.

Computer science is supposed to be hardcore... unfortunately, there is huge variation in what different universities consider to be computer science, let alone what the business world thinks. For some, any old programming counts as CS; others focus on software engineering methods and hardly touch theory and math at all; others still consider web page design to be CS. CS is about understanding the extreme limits of what computers and software are capable of and pushing the boundary of what's possible... it's not supposed to train you for "IT" (which most businesses take to mean the guys who fix their computers).

You really should not be doing a computer science degree unless you are going to be some kind of developer and you get off on things that require in-depth knowledge of how to design and compare the performance of different algorithms, want to fix bugs no one else can, want to write really hardcore software (speech recognition, computer vision, or 3D rendering, say) at the bleeding edge, and need to be able to prove why your design beats someone else's. The industry is already full of very experienced, very competent people who don't have CS degrees; in fact, many of them started before such degree programs even existed. They know how to code, but they generally have no exposure to the more advanced theory, so they aren't inspired by it, nor do they generally value it. The degree is MUCH more a long-term investment in your career than a credential to get your foot in the door; you'll eventually get to apply the theory and start doing things that wow people. After you've taken your lumps, that is.
