Wikipedia:Reference desk/Archives/Computing/2010 August 17
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
August 17
Hardware Accelerated AES Encryption
How does hardware accelerated encryption work? The processor performs the same algorithm whether it's done by the hardware or software, so how is it that hardware acceleration is faster? [EDIT:] I have an Intel Core i5 processor, just so you know. --Yanwen (talk) 00:42, 17 August 2010 (UTC)
- See Parallel computing. By offloading work to a specialized hardware unit, the CPU is free to do other work while a peripheral device computes the hash. This can actually speed up end-to-end processing for a workflow (even if the actual calculation of the hash is slower than it would have been on the CPU). It is also possible that the specialized peripheral uses some hardware, like vector processing or SIMD, to compute the hash in fewer clock cycles than a general-purpose CPU. In either case, offloading the work from the CPU can also increase the throughput capability of the system (which is a different performance metric than single-job execution time). The exact speedup or throughput improvement depends entirely on the characteristics of the workflow and the load. For a highly-utilized server that computes thousands of hashes per second, such accelerators are probably a good value; for a personal computer, you would rarely see any worthwhile benefit. Nimur (talk) 00:54, 17 August 2010 (UTC)
- So there is a separate hardware unit just for computing the hashes and other AES operations... How is this different from your typical multi-threaded application? Wouldn't you get the same performance boost by using more threads? --Yanwen (talk) 01:24, 17 August 2010 (UTC)
- Threads have to be executed on hardware. Creating more threads does not help unless there is idle hardware available to execute the thread. As for why you would want to use specialized cryptographic hardware instead of just adding more general purpose cores, there could be two reasons. (1) If you know exactly what the hardware is going to be doing, you can make it faster and smaller than general purpose hardware. (2) Cryptographic keys can be stored in tamper-resistant hardware so the unencrypted data goes in, the encrypted data comes out, and the general-purpose processors do not have access to the keys. Jc3s5h (talk) 01:55, 17 August 2010 (UTC)
- Is this feature only available to Core i3 and above? How about Core 2 Duo? Is specialist software needed? --Tyw7 (☎ Contact me! • Contributions) Changing the world one edit at a time! 03:31, 17 August 2010 (UTC)
- AES instruction set lists the processors that support it. There are no Core 2 Duos or Core i3s on the list, and a couple of Core i5s are excluded also. Software needs to be rewritten to use the new instructions. The article lists some programs that have been updated. -- BenRG (talk) 08:57, 17 August 2010 (UTC)
- Algorithms implemented directly in silicon are faster. Imagine your microprocessor didn't have a built-in instruction for integer addition, and you wanted to perform that operation on the contents of a couple of registers. You would have to implement an adder manually using primitive operations the processor did support. Each time you take the AND or XOR of registers A and B and store the result in register C, the processor has to decode the instruction, check that the previous operations writing to A and B have completed and wait for them if not, dispatch the values to the appropriate execution unit, and deliver the result to any pending instructions waiting for the write to C. The bookkeeping takes orders of magnitude more time than the logical operation itself. You'd probably need ~100 instructions and 50–100 cycles to add two 32-bit registers by this process, and multiplication would be far worse. In contrast, if there are ADD and MUL instructions implemented in silicon, the processor just has to get the source registers once, send the bits to the silicon gates, and get the result from the other end. The intermediate bits flow directly to the gates implementing the next step, without any of the overhead. That's why AES is faster with specialized instructions. AES makes use of operations, such as finite field multiplication, that have to be laboriously simulated on most microprocessors (a sketch of that simulation follows below).
- Parallelism is not really the issue. Depending on how Intel implemented AES, it might be possible to run an AES computation in parallel with operations like integer multiplication, but few applications are going to bother to do this. Maybe if you were simultaneously encrypting and computing a cryptographic hash of a message you could write a single function that did both and save some time. But this definitely isn't what gives you the speedup on raw benchmarks of hardware vs software AES. -- BenRG (talk) 04:17, 17 August 2010 (UTC)
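- To make the "laboriously simulated" point concrete, here is a sketch (in plain JavaScript, chosen only to match the snippet further down this page; not production AES) of the GF(2^8) multiplication that AES's MixColumns step requires. In software it costs a loop of shifts, XORs and branches per byte pair; a dedicated AES instruction performs an entire round, this step included, in hardware.
function gmul(a, b) {
    // Software GF(2^8) multiplication (see FIPS-197, section 4.2).
    var p = 0;
    for (var i = 0; i < 8; i++) {
        if (b & 1) p ^= a;        // conditionally accumulate a
        var hiBit = a & 0x80;
        a = (a << 1) & 0xFF;      // multiply a by x
        if (hiBit) a ^= 0x1B;     // reduce modulo x^8 + x^4 + x^3 + x + 1
        b >>= 1;
    }
    return p;
}
alert(gmul(0x57, 0x83).toString(16)); // "c1" - the worked example in FIPS-197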
- Algorithms implemented in silicon are only faster than software if they are faster. (This is a truism, but BenRG's above statement is propagating a common misconception that hardware accelerators somehow provide "magic speed-boosts".) There are loads of examples where a hardware-accelerator unit is actually slower than a CPU performing the same task. It depends on the implementation and the task. One problem with specialized hardware units is that they rarely use the latest-and-greatest in CMOS process technology - so they can't be clocked as fast as an Intel CPU. (See these Discretix AES cards at 220 MHz.) In that case, they are only faster at executing tasks if their advantage in operations-per-clock outweighs their lower clock rate compared to the CPU. But there are lots of reasons why a "slow" hardware accelerator might still be "better" - it might consume less power; generate less heat in a data center; increase throughput; reduce high-level software network transactions; it might be cheaper to purchase/operate according to a "dollars-per-million-transactions" analysis; and so on. And, in many cases, because it is specialized for a few important operations, a hardware accelerator does actually decrease end-to-end processing time. As for parallelism, I can think of a perfect example - in fact, probably the most common example - let the CPU process data, and then encrypt it. If the CPU needs to time-share between processing and encrypting, it is slowed down significantly (by an amount calculated with Amdahl's law); but if the CPU can spend 100% of the time processing, and then pipe data to an encryption accelerator, you have created a deep pipeline, improving throughput. Nimur (talk) 18:18, 17 August 2010 (UTC)
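- For reference, the bound invoked above: if a fraction p of the total work is the encryption being offloaded and that fraction is sped up by a factor s, Amdahl's law gives the overall speedup (the numbers below are illustrative, not measurements):
\[
\text{speedup} = \frac{1}{(1 - p) + p/s},
\qquad \text{e.g. } p = 0.3,\ s = 10 \;\Rightarrow\; \frac{1}{0.7 + 0.03} \approx 1.37
\]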
- I was talking specifically about the AES instructions in newer Intel CPUs. I think the original poster was specifically interested in those instructions, though I see now that the original question you replied to doesn't say anything about that. So I think we're both right. -- BenRG (talk) 19:45, 17 August 2010 (UTC)
- Ah, yeah. I wasn't even thinking about on-chip acceleration; I think the OP clarified to mean specifically the Intel AES instruction set; details are provided at the official Intel site. I have a feeling those extensions will put a lot of encryption-peripheral-manufacturers out of business... Nimur (talk) 21:03, 17 August 2010 (UTC)
Computer -> TV
I have a computer, want a TV, and don't want to buy a DVD player (I used to just watch DVDs on my laptop, but the screen is a tad small). If my laptop is the Apple MacBook Pro from mid-2009 and I buy the adapter to HDMI, can I plug into the HDMI port on my TV and watch movies and stuff on my TV from my computer? --173.58.234.169 (talk) 03:05, 17 August 2010 (UTC)
- Most computers have an HDMI port. Even my Dell Studio 15 has one. If your computer has that port, just buy an HDMI cable and connect the computer to a TV that ALSO has the port. Note: your TV has to have an HDMI port (found on most HD TVs, not old/cheap ones) for this to work. Even if your computer has an HDMI port or you buy an adapter, as long as your TV doesn't have the port, all your efforts are futile. The easiest way is to get a cheap DVD player (about £10-£20) which works with the standard RGB ports or the extremely old SCART port that is found on most TVs (even very old ones... or at least most very old ones I've seen). --Tyw7 (☎ Contact me! • Contributions) Changing the world one edit at a time! 06:55, 17 August 2010 (UTC)
- MacBook Pros do NOT have HDMI ports. I am 100% certain about this. I would need an adapter from whatever the port is to HDMI, and I might need to run it through another adapter too, if that makes sense. So the question is, is using adapters feasible, knowing Apple does not include HDMI ports? --173.58.234.169 (talk) 03:58, 17 August 2010 (UTC)
- Our Macbook Pro article verifies that no MacBook Pro has HDMI, but Mini DisplayPort instead. Googling macbook pro hdmi yields several adapters that seem to cost around US$5. Comet Tuttle (talk) 05:40, 17 August 2010 (UTC)
- You might have to consider how to get the sound to your TV - "...older 2009 line of MacBooks and MacBook Pros are unable to provide an audio signal through the Mini DisplayPort, and only do so over USB, Firewire, or the audio line out port instead (the April 2010 line of MacBook Pro, however, supports this)." Astronaut (talk) 09:32, 17 August 2010 (UTC)
- I don't use HDMI, just the regular VGA (with a MacBook, but really the same difference, albeit with a different converter), and the audio thing is not such a big deal — you just run it out of the Audio Out port. But yeah, all of this is very plausible and super easy to do in my experience. I watch Netflix InstantWatch and so on on the TV all the time. Just make sure the TV has the right input ports. Most of them these days have VGA as well, in my experience. I'm not sure there's any advantage to HDMI over VGA in this particular situation? The resolution of the monitor mirrors what is on the laptop, which is already high-def. --Mr.98 (talk) 13:33, 17 August 2010 (UTC)
- I am buying the TV in the near future. I also want to use InstantWatch in addition to DVDs, which is why I need the cord. I would have assumed that quality is lost with regular VGA, though; and do newer TVs even have it if they have HDMI? (I have been so focused on HDMI, I haven't even checked.) I find a lot of retailer websites so cryptic about which inputs and outputs a model has that it is difficult to tell. I have to admit going to the store is not a ton better, except that you can look. As to audio, what kind of cord is needed for that? Thanks.--173.58.234.169 (talk) 18:53, 17 August 2010 (UTC)
- While most retailer websites are pretty thin on details like exactly which ports are installed (and, I have noticed, they are sometimes wrong), manufacturer websites are a better source of information about their products. In my experience, not many TVs have a separate audio-in, but where I have seen audio-in it has always been via left & right RCA connectors (but maybe that's a European thing). The ideal way would be to find a converter box to take video and audio from your MacBook Pro and send it to the TV with one HDMI cable. This MacWorld article, which discusses many of the issues, recommends some specific products. Astronaut (talk) 09:10, 18 August 2010 (UTC)
Thanks, your solution seems best for what I am trying to do (the MacWorld one) and should work in the US.--173.58.234.169 (talk) 21:56, 19 August 2010 (UTC)
- When I use VGA on my TV it looks pretty much identical to the input. If it is worse in quality or refresh rate or colors or whatever, I certainly can't tell. But I'm not a hi-fi style buff or anything. The resolution is identical on my TV/laptop combination, which is all I am going for (it's higher than the InstantWatch resolution/refresh itself). I'm pretty sure it's pretty standard these days for at least a VGA port and a standard stereo plug (e.g. "headphone") input to be available on newer LCD TVs, at least when I went looking at them. (I ended up just buying one at Costco, after hunting around — they had the best prices, well ahead of places like Best Buy.) More tricky in my experience is audio output, which is often an optical plug, which requires you to use compatible speakers or to get some kind of converter. --Mr.98 (talk) 12:53, 18 August 2010 (UTC)
computer file with pdb extension
I got a computer file with a .pdb extension. I don't know how to open it. Any ideas? Thank you. 124.43.25.100 (talk) 08:17, 17 August 2010 (UTC)
- Is this any use? --Phil Holmes (talk) 09:26, 17 August 2010 (UTC)
- It would help us if you told us what you thought the file was supposed to be. Is it a document someone sent you? An old set of notes? Something you downloaded from a website? --Mr.98 (talk) 13:35, 17 August 2010 (UTC)
- For what it's worth, the .pdb files on my system exist because I use Microsoft Visual Studio. The .pdb file does nothing by itself. Is it in a folder along with some other software? Comet Tuttle (talk) 17:30, 17 August 2010 (UTC)
- This is why it's important to say where the file came from. All the files on my system with a .pdb extension came from the Protein Data Bank and can be opened with a molecular viewing program. These programs, however, would do nothing with Comet Tuttle's Visual Studio file. -- 140.142.20.229 (talk) 18:30, 17 August 2010 (UTC)
- We have articles or dab pages on most extensions, such as PDB. — Gadget850 (Ed) talk 19:54, 17 August 2010 (UTC)
2 anti virus software running together
Apart from slowing down your computer, why shouldn't you run two AV programs together? What happens when you do? Mo ainm~Talk
- It can cause problems because the two (or more) programs aren't necessarily going to be aware of each other and can get in each other's way when doing active scanning. Also, when a virus is found it can cause quite a "fight" as to who gets to deal with it. Personally, though, I use Symantec Endpoint Protection as well as Microsoft Security Essentials and they work together just fine even when something is found, so unless you actually have problems running two programs I wouldn't worry about it. But assuming both programs are up-to-date, running more than two is probably overkill and will degrade your system performance with no real benefit. ZX81 talk 18:51, 17 August 2010 (UTC)
- As virus scanners can contain portions of actual virus code for identification purposes, rival AV suites can actually flag each other as 'rogue' programs when they see the code in their own scan. Exxolon (talk) 20:59, 17 August 2010 (UTC)
- If you want to run two AV engines, there are products that offer that feature. So you would be running one AV product, but with two engines from different manufacturers, which avoids the issue of two AV products detecting each other's virus samples or similar problems.
- If you want to do it manually, you should find out the names of the program directories and add a scanning exclusion for those (install AV #1, add exclusion, deactivate AV #1, install AV #2, add exclusion, activate AV #1). There's no guarantee that it will work; it just increases the chances. On-Access scanners might still get in a fight because they hook into file access routines, and basically have to hook into each other if more than one is installed.
- I'd recommend going the several-engines-combined-into-one-product route if you're looking for more protection than one AV can offer. Or you could install the On-Demand components of multiple AV products, add exclusions for them in the one On-Access scanner you run, drop all your "suspicious" files into one folder, and add a batch file that triggers scanning of this folder with all your On-Demand scanners.
- If this is for company use, you could use one dedicated AV scanning computer that runs a different AV than the rest of your machines, and make it a company rule that all media entering and leaving the building must be scanned there. Same goes for Web proxies, Mail servers, etc. - run a different AV on them than the desktop machines. -- 78.43.71.155 (talk) 09:04, 18 August 2010 (UTC)
Need input - writing a basic referrer tracking script
Hi,
I want to learn more about how Google Analytics works by writing a 'basic' version of it which (initially) only tracks the referrer and the page being visited (it's going on a template). I plan to implement the tracker using a 1x1 pixel approach, using JavaScript like this:
<script type='text/javascript'>
document.write("<img src='//my.tracker.url.here/pixel.php?args=" + someVariablesHere +"' width='1' height='1' alt='' /> ");
</script>
For the PHP side, I came across this PHP code to generate the actual GIF image, with room for me to add the database part (a server-side sketch follows the questions below). Now some questions:
- Which is more reliable for getting the referrer: JavaScript (document.referrer) or PHP ($_SERVER["HTTP_REFERER"])?
- Aside from the referrer, what other information is commonly collected?
- Is the protocol-relative syntax for the image source ("//url.here.com") safe in terms of implying http/https, or should I code the protocol explicitly (detecting it via window.location.href)?
Open to advice regarding specifics and good practices. TIA PrinzPH (talk) 22:39, 17 August 2010 (UTC)
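- For illustration only: a hypothetical stand-in for the pixel.php endpoint, sketched in Node.js so the examples on this page stay in JavaScript. The port number, log format and parameter names are invented; the PHP code linked above does the same job of returning a 1x1 GIF and recording the request.
var http = require("http");
var url = require("url");

// A well-known 1x1 transparent GIF, decoded from base64.
var GIF = Buffer.from(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7", "base64");

http.createServer(function (req, res) {
    var query = url.parse(req.url, true).query;
    // req.headers.referer is the page embedding the pixel; query.ref (if
    // the client-side script sends one) is that page's own referrer.
    console.log(new Date().toISOString(),
                req.headers.referer || "-", JSON.stringify(query));
    res.writeHead(200, {
        "Content-Type": "image/gif",
        "Content-Length": GIF.length,
        "Cache-Control": "no-store"  // force a request on every page view
    });
    res.end(GIF);
}).listen(8080);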
- You could use a non-expiring tracking cookie with a globally unique ID. You should write out the full URL, explicitly choosing http: or https:. PHP is more reliable than JavaScript (the server is always more reliable than the client). Other information that could be collected includes duration of stay, mouse movements, and client information (browser, OS, screen size, etc.). Smallman12q (talk) 20:43, 18 August 2010 (UTC)
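- A minimal client-side sketch of that suggestion (the cookie name "vid" and the query parameters are invented; the tracker URL is the placeholder from the snippet above):
// Reuse an existing visitor id from the cookie, or mint and store one.
function getVisitorId() {
    var m = document.cookie.match(/(?:^|; )vid=([^;]+)/);
    if (m) return m[1];
    var id = new Date().getTime().toString(36) +
             Math.random().toString(36).slice(2);
    // Ten years of max-age stands in for "non-expiring".
    document.cookie = "vid=" + id + "; max-age=315360000; path=/";
    return id;
}
var params = [
    "vid=" + getVisitorId(),
    "ref=" + encodeURIComponent(document.referrer),
    "page=" + encodeURIComponent(location.href),
    "res=" + screen.width + "x" + screen.height,
    "ua=" + encodeURIComponent(navigator.userAgent)
].join("&");
document.write("<img src='//my.tracker.url.here/pixel.php?" + params +
    "' width='1' height='1' alt='' />");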
- Thanks for the reply, will try it out! :) PrinzPH (talk) 20:04, 19 August 2010 (UTC)
- You will notice that Google Analytics itself codes for the http/https distinction. I think the issue is that if you are on an https page and serve up unsecured content, some browsers will either block it or give a nasty warning message. --Mr.98 (talk) 15:43, 21 August 2010 (UTC)
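- A sketch of that explicit handling, modeled on the protocol check the Google Analytics snippet of the time performed (the host is still the placeholder from above):
// Pick the scheme to match the embedding page instead of relying on "//".
var proto = ("https:" == document.location.protocol) ? "https://" : "http://";
var src = proto + "my.tracker.url.here/pixel.php?ref=" +
    encodeURIComponent(document.referrer);
document.write("<img src='" + src + "' width='1' height='1' alt='' />");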
Wikia's poor quality of service
Dear Wikipedians:
Is it just me, or does anyone else notice that Wikia's quality of service seems really poor?
I started a new wiki today on Wikia, and I noticed these frequent "black-out" periods where the whole Wikia site seems to crash and not respond. I remember Memory Alpha and Uncyclopedia being really reliable in the past, but not anymore.
Anyone else know more about this problem?
Thanks,
70.31.154.183 (talk) 23:38, 17 August 2010 (UTC)
- I stopped going to Wikia sites when they forced that horrible "New Monaco" skin on every wiki and stole Uncyclopedia's domain name, so I don't know what their service is like currently. But from past experience Wikia has always been quite slow, with a lot of timeouts. They probably have an outdated server. 82.44.54.4 (talk) 11:13, 18 August 2010 (UTC)
- Thanks. That about explains it. 70.31.154.183 (talk) 12:31, 18 August 2010 (UTC)
- I don't see how you can claim someone stole a domain name when the owner apparently transferred ownership entirely voluntarily. If you're referring to the move from www.uncyclopedia.org to uncyclopedia.wikia.com as the primary hosting location, that appears to be a separate issue that happened long after ownership of the domain name was transferred. Nil Einne (talk) 16:42, 20 August 2010 (UTC)