Wikipedia:Reference desk/Archives/Computing/2012 November 18
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
November 18
Gadgets in 8
- How can I install a gadget in Win8?
- Somebody told me that the gadgets for Win7 will still work in Win8…
Iskánder Vigoa Pérez 13:51, 18 November 2012 (UTC) — Preceding unsigned comment added by Iskander HFC (talk • contribs)
- I think you need something like 8GadgetPack to run Win7 gadgets in Win8. Trio The Punch (talk) 14:22, 18 November 2012 (UTC)
- Since I may find myself in the same position soon enough, what do you mean by gadgets? My Apple wireless keyboard (which works perfectly with Vista)? Wireless mouse? Or something more speccy? IBE (talk) 14:45, 18 November 2012 (UTC)
- Well, it's a bit confusing. Microsoft first introduced gadgets in Windows Vista, as part of Vista's Sidebar. In Windows 7 the Sidebar was removed; instead, you can place gadgets directly on the desktop. After a while Microsoft released a security advisory that suggested disabling the Windows Sidebar and gadgets to protect the operating system against security vulnerabilities that exploit the feature. Nowadays (in Win8) Microsoft wants us to use tiles, though many people call them gadgets as well. Some of the tiles are "live"; a live tile is basically the same thing as a gadget. Microsoft removed support for Win7 gadgets in Win8, but software like 8GadgetPack solves that problem. But why would you want to run a gadget anyway? You can use a tile. If you don't like tiles and you want useful but ugly system statistics you can use Tinyresmeter, and if you want to make your desktop look "cooler" you can use Rainmeter. Trio The Punch (talk) 15:13, 18 November 2012 (UTC)
- A former programmer in our group made a scheduler that we use for specific purposes; I guess 8GadgetPack is what I need.
- Thanks for the link
Iskánder Vigoa Pérez (talk) 15:56, 18 November 2012 (UTC)
- Yep. And if it does not work for some reason you can try the other methods mentioned here. Trio The Punch (talk) 01:00, 19 November 2012 (UTC)
Idle computer power
Can you offer the relatively underused processing power of your PC to some sort of distributed enterprise for cash? If that's not possible, can you offer it to some scientific endeavor? Comploose (talk) 17:41, 18 November 2012 (UTC)
- Click on some of the links in the See also section of the SETI@home article, such as BOINC and Folding@home. You can also run a Bitcoin miner. Or you can help poor web developers. Trio The Punch (talk) 17:44, 18 November 2012 (UTC)
I would NOT recommend Bitcoin mining with a CPU. Your power cost will overwhelm any possible return. Bitcoin mining requires high-end graphics cards to be done profitably, and even these are likely to become unprofitable when FPGA and ASIC mining comes on stream. Just google for Bitcoin Mining Hardware Comparison. Kram (talk) 20:29, 18 November 2012 (UTC)
In the old days, PCs used about the same amount of power whether they were busy or idle, so burning spare cycles on something interesting didn't waste any electricity. Today's CPUs (at least the faster ones) are quite power-hungry when busy, so you will get a visible bump in your electric bill if you mine bitcoins or run SETI or whatever 24/7. Here in California a rough approximation is that 1 watt of power consumed 24/7 costs $1 a year. Max out a 100-watt CPU or a 300-watt graphics accelerator nonstop, and it adds up pretty fast. 67.119.3.105 (talk) 22:36, 18 November 2012 (UTC)
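A rough sketch of that arithmetic in Python, assuming the roughly $0.11/kWh rate implied by the "$1 per watt-year" rule of thumb above (actual rates vary):

```python
# Back-of-the-envelope cost of running a load 24/7, based on the
# "1 watt for a year ~ $1" rule of thumb quoted above.
# The $0.114/kWh rate is an assumption chosen so the rule comes out to ~$1.

HOURS_PER_YEAR = 24 * 365          # 8760 hours
RATE_USD_PER_KWH = 0.114           # assumed California-ish residential rate

def annual_cost(watts, rate=RATE_USD_PER_KWH):
    """Cost in USD of drawing `watts` continuously for one year."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000.0
    return kwh_per_year * rate

if __name__ == "__main__":
    for load in (1, 100, 300):     # 1 W baseline, 100 W CPU, 300 W GPU
        print(f"{load:>3} W around the clock ~ ${annual_cost(load):.2f}/year")
    # Prints roughly $1, $100, and $300 per year respectively.
```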
Bitcoin miner performance tends to be measured in MH/s (megahashes per second), where a hash is a single SHA-256 hash calculation. CPUs typically mine at single-digit MH/s, GPUs in the hundreds. At current returns, 1 MH/s will mine approximately 10 micro-bitcon per hour, which is worth approximately 100 micro-US-dollars. So a typical CPU will return perhaps a few dollars to a few tens of dollars per year. As mentioned above, the power cost is hugely greater. Furthermore, the return on mining bitcoin is designed to reduce over time. It will halve at the end of November, though market forces will alleviate this reduction somewhat. On the horizon is ASIC-based hardware, which should greatly increase performance. In summary, bitcon mining is a big boys' game, not for the amateur. Kram (talk) 22:49, 18 November 2012 (UTC)
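The same kind of estimate for mining returns, using the 2012-era figures quoted above (1 MH/s ≈ 10 µBTC/hour ≈ 100 µUSD/hour); the 5 MH/s and 300 MH/s hash rates are assumed examples, not measurements:

```python
# Rough gross mining return using the figures quoted above (2012 values):
# 1 MH/s mines ~10 micro-BTC per hour, worth ~100 micro-USD.
# A typical CPU manages single-digit MH/s; 5 MH/s is an assumed example.

USD_PER_MHS_HOUR = 100e-6        # ~100 micro-dollars per MH/s per hour
HOURS_PER_YEAR = 24 * 365

def yearly_return_usd(mhs):
    """Gross mining income in USD per year for a hash rate in MH/s."""
    return mhs * USD_PER_MHS_HOUR * HOURS_PER_YEAR

print(f"5 MH/s CPU:   ~${yearly_return_usd(5):.2f}/year gross")    # ~$4.40
print(f"300 MH/s GPU: ~${yearly_return_usd(300):.2f}/year gross")  # ~$263
# Compare with the electricity figures above: a maxed-out 100 W CPU costs
# on the order of $100/year in power, so CPU mining is a clear loss.
```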
- A bit of a Freudian Slip there, bitcon vs bitcoin, but I think I'll let it stand. Kram (talk) 22:59, 18 November 2012 (UTC)
- Hahaha. I know the solution for that problem: bitcoin mining via botnets. Trio The Punch (talk) 00:04, 19 November 2012 (UTC)
- According to our Bitcoin article, it's already happening. 209.131.76.183 (talk) 14:37, 19 November 2012 (UTC)
- I know. I found some source code in a pastebin. It's a pretty smart way to monetize botnets. Trio The Punch (talk) 21:15, 19 November 2012 (UTC)
- As to renting out your CPU for money, I find it somewhat unlikely that it can work profitably for you. There are cloud services that offer large CPU farms for hourly rent. You would be competing against them, and they have advantages like standardized service, reliability, storage, and network connectivity that a bunch of random home computers can't offer. You'd have to be ultra-cheap to compete, and then you'd have trouble paying for power and machine wear. A big room full of server cabinets is quite cost-efficient compared to a bunch of home computers. The free CPU projects listed above are a more likely home for your spare cycles. Even then, power use and global warming considered, I wonder whether the world would be better served by a fat check from some billionaire to SETI@home etc. to buy some cloud CPU time. 88.112.41.6 (talk) 16:46, 19 November 2012 (UTC)
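A back-of-the-envelope version of that comparison; the $0.10/hour cloud price and 100 W draw below are assumed, illustrative figures, not quotes from any provider:

```python
# Illustrative comparison: what a home machine could plausibly charge for
# CPU rental versus what its electricity alone costs. All numbers assumed.

CLOUD_PRICE_PER_HOUR = 0.10      # assumed price of a small cloud instance, USD
HOME_LOAD_WATTS = 100            # assumed draw of a busy home PC
RATE_USD_PER_KWH = 0.114         # same assumed rate as above

power_cost_per_hour = HOME_LOAD_WATTS / 1000.0 * RATE_USD_PER_KWH
print(f"Power cost per busy hour: ${power_cost_per_hour:.4f}")   # ~$0.011

# To undercut the cloud you must charge well below $0.10/hour, and after
# power, wear, and unreliability discounts there is very little margin left.
margin = CLOUD_PRICE_PER_HOUR - power_cost_per_hour
print(f"Best-case margin before wear/overhead: ${margin:.3f}/hour")
```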
Web design
Hi, when people design cool, simple webpages like this one, what do they do? Firstly, do most web designers use a WYSIWYG HTML editor, or do they hand-code? Do they mainly use templates, and just insert bits and pieces by cut and paste, or do they need to do a lot from scratch? And perhaps most importantly, where do they get their cool graphics and pictures from? I'm aware there's plenty of stuff with Creative Commons licences, but I've done quite a lot of googling for them, and the bulk of what you turn up seems to be pretty drab, or totally inappropriate for any given website (i.e. some of the images look cool, but very few are at all general in their relevance). Thanks in advance, IBE (talk) 19:58, 18 November 2012 (UTC)
- For images, there are many stock photography websites like iStockPhoto (many of which also feature graphic design stuff). These are often pretty cheap and the licence you buy for a given image is sufficiently flexible for every use I've had. Some stock sites allow you to contact the photographer or designer directly, meaning if you find say a graphic designer whose style you think matches the project, you can mail them and ask "how much would you charge for a logo in the style of <this pic you did>, with an orange squid playing the xylophone?". -- Finlay McWalterჷTalk 20:16, 18 November 2012 (UTC)
- Hey, that's rather helpful, I must say, because somehow it didn't dawn on me that there would be specialist sites aggregating Creative Commons images, like your iStockPhoto, but free. I had heretofore been using only Google, with its licence-filtering option. And so I found this. Call me an idiot, but isn't Wikimedia wonderful? IBE (talk) 20:44, 18 November 2012 (UTC)
- Commons can be useful, but if you're designing professional websites, you often need some rather dull images that people might not upload to Commons - stuff like "business meeting" or "legal paperwork". Plus, personality rights are a big problem - professional photographers get model releases from the models who appear (as "dentist" or "business lady" or "judge") in their photos - you essentially never get that on a Commons photo. If you're talking about real professional web design, professionals pay what needs to be paid; they don't scrape around Flickr or Commons looking for some free thing they can use for nothing - because their time is expensive, and stock imagery is usually very cheap (and billable anyway). -- Finlay McWalterჷTalk 23:01, 18 November 2012 (UTC)
Most use horrible GUI apps and templates. This person probably used a decent GUI app on Mac OS, and did a lot of hand coding. ¦ Reisio (talk) 21:47, 18 November 2012 (UTC)
- So is the GUI app typically a WYSIWYG, and you add hand-coded tweaks? Some apps I've come across for HTML are purely template-based, i.e. you click what you want and it just writes the tags to the document, and you edit as you go. Of course you can constantly refresh the page, but it's not nearly the same as WYSIWYG. On the other hand, I've tried WYSIWYG (e.g. with OpenOffice), and it is basically awful. IBE (talk) 22:01, 18 November 2012 (UTC)
- I dare say most people use something that has WYSIWYG (and most of these have a code view, too). Actual professionals (in terms of code quality, not salary) do not use WYSIWYG. ¦ Reisio (talk) 22:09, 18 November 2012 (UTC)
- For a site that's being produced by a team, with a real graphic designer and a real web developer (rather than a good-at-neither hybrid person), the designer will often produce a Photoshop, InDesign, or Illustrator mockup, and if that's approved they'll emit a detailed specification (fonts, RGB colours, often some CSS) and graphics layers (PNGs, hopefully) - which the developer will build from. A professional site (one that's more than just a brochure) will almost always be built using some kind of web template system; but indeed lots of other sites are jammed together with allegedly WYSIWYG tools (there's no such thing as WYSIWYG on the web, which real web designers know all too well). A sensible middle path, and a happy medium for many modest sites, is to skin Joomla or WordPress or similar templates. -- Finlay McWalterჷTalk 23:09, 18 November 2012 (UTC)
- I worked at a place that did pretty big-name websites. After getting information from the customer (or having a proposal accepted) on the general concepts and ideas they want to focus on, the creative department would come up with a few very rough proposals. These would cover basic things such as what sorts of pages would be on the site, and how the content would help push the intended message towards the user. After one is approved by the customer, a user experience designer would flesh out details of all the sections of the website. This includes layout, behavior of links and menus, and other details. This was done simultaneously with the creative department, which would refine the design and come up with more detailed graphics. The artists design their own graphics, and photos of specific subjects (such as the product) are taken by the photographer. Other photos tend to come from a stock photo agency. By the end of the user experience phase, we would have a complete document with graphics representing exactly what each part of the site should look like and how it will respond to user interaction with every component. This document, along with the creative assets (images produced for the layout), would then get passed on to the dev team. Depending on the site, it could be fully hand-coded, or designed to work in a CMS that the customer could take over (still with a lot of hand-coding involved). The dev team consults with the user experience and creative teams for anything they need clarified or drawn. Eventually, the site is completed and handed off to the testers with the same specification the developers used. Oh, and during the process, creative writing works on making content for the site - generally the specification holds placeholder text. Other firms use different processes, and smaller-scale things can be done by one or two people, but the key thing to take away from this is that it is a well-ordered process of thought and design that makes sure the final product is well thought through. Development doesn't even start until there is a good plan for every element of the site, and most of the graphics are designed before coding. Designing the site as you code often leads to more amateurish results: if you've put dozens of hours into the site, and then, while solving a problem with it, realize that things would have worked out better with a design change that would require throwing out most of your work, you end up motivated to do what you can to make what you have work, even if you have realized it isn't the best solution. 209.131.76.183 (talk) 13:19, 19 November 2012 (UTC)
Great answer, folks - that last sentence hits home. Still, for a one-person operation, writing as you go is by far the simplest way to learn, although it wouldn't do for a professional job. More info always welcome, IBE (talk) 15:29, 19 November 2012 (UTC)
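For readers wondering what the "web template system" mentioned earlier in this thread actually does, here is a minimal sketch using Python's Jinja2 library as one example of such a system; the template and page content are made up purely for illustration:

```python
# Minimal illustration of a web template system, using Jinja2 (one of many).
# The template holds the page structure; the data is filled in at render time.
from jinja2 import Template

page = Template("""\
<html>
  <head><title>{{ title }}</title></head>
  <body>
    <h1>{{ title }}</h1>
    <ul>
    {% for item in items %}
      <li>{{ item }}</li>
    {% endfor %}
    </ul>
  </body>
</html>
""")

# Hypothetical content; in a CMS this would come from a database or an editor.
html = page.render(title="Portfolio", items=["About", "Projects", "Contact"])
print(html)
```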