Wikipedia:Reference desk/Archives/Computing/2014 May 12

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


May 12


How can I modify the Javascript of a website I visit?


Sometimes websites do quite annoying things with Javascript, like disabling right-click, playing music, etc. Is there an easy way (add-on maybe?) for me to edit the Javascript of a website and reload the page with modified JS? I know there are dedicated add-ons out there that prevent right-click disabling, or you could write your own userscript in GreaseMonkey, but I'm looking for a broader solution. Something like: visit site.com > edit the Javascript source file locally > reload the page with modified JS.-- penubag  (talk) 05:03, 12 May 2014 (UTC)[reply]

You could do something like that with a local proxy but reverse engineering and modifying dozens of chunks of potentially obfuscated JS sounds awfully tedious. I usually just use adblock and figure out the offending JS file by trial and error and block it (in practice, just block all the JS until the problem stops, occasionally making adjustments if blocking a JS makes the page unusable). You can also use noscript to block JS completely on a site-by-site basis, but it's a clumsy add-on in my opinion. 70.36.142.114 (talk) 05:28, 12 May 2014 (UTC)[reply]
Thanks but if I wanted to modify JS variables the blocking methods wouldn't work. Could you elaborate on the local proxy solution? -- penubag  (talk) 06:01, 12 May 2014 (UTC)[reply]
The local proxy just sits between the browser and the internet, rewriting addresses and content as you like. Privoxy is an example. 70.36.142.114 (talk) 06:42, 12 May 2014 (UTC)[reply]
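For experimenting with the general idea (though this is not Privoxy itself, and nothing like its filter syntax), here's a minimal sketch of a rewriting proxy in Python. It handles plain HTTP only, and the port number and the example rewrite rule are just placeholders — point the browser's HTTP proxy setting at localhost:8888 and adjust the rewrite to whatever JS you want to change.

    # Minimal rewriting HTTP proxy sketch (plain HTTP only, illustrative).
    import urllib.error
    import urllib.request
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    class RewritingProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            # When the browser uses this as a proxy, it sends the absolute URL here.
            url = self.path
            try:
                upstream = urllib.request.urlopen(urllib.request.Request(url))
                body = upstream.read()
                ctype = upstream.headers.get("Content-Type", "")
                status = upstream.status
            except urllib.error.HTTPError as err:
                body, ctype, status = err.read(), "text/html", err.code
            except Exception:
                self.send_error(502)
                return

            # Placeholder rewrite: neuter a right-click-blocking handler in JS files.
            if "javascript" in ctype or url.endswith(".js"):
                body = body.replace(b"document.oncontextmenu", b"window.__unused")

            self.send_response(status)
            self.send_header("Content-Type", ctype)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    ThreadingHTTPServer(("127.0.0.1", 8888), RewritingProxy).serve_forever()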
I wasn't able to get Privoxy to work but I did some research and Fiddler has everything that I need to edit Javascript, POST data, Headers etc. Really glad I stumbled on that tool, it really has everything you'll ever need. -- penubag  (talk) 11:39, 12 May 2014 (UTC)[reply]
Cool, I didn't know about Fiddler. It looks like it's closed-source, which may become a pain if you want to do complicated rewrites that make you want to modify the code. It occurs to me that you could also use a caching proxy like Squid to serve your own edited versions of those JS files instead of the remotely served ones. That would even speed up your browsing. If the JS is coming from some place like googleapis, maybe you could alternatively use /etc/hosts or the Windows equivalent to direct it to your own local HTTP server. 70.36.142.114 (talk) 16:58, 12 May 2014 (UTC)[reply]
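To flesh out that last suggestion, here's a minimal sketch of the hosts-file approach in Python; the host name static.example.com and the file app.js are hypothetical placeholders, not anything from the thread. Point the CDN host at 127.0.0.1 in /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows), then serve your edited copy of the file locally so the browser picks it up instead of the real one.

    # Hypothetical example: the page loads http://static.example.com/app.js.
    # 1. Add "127.0.0.1  static.example.com" to the hosts file.
    # 2. Save your edited app.js in the current directory.
    # 3. Run this (binding port 80 usually needs admin/root rights):
    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    ThreadingHTTPServer(("127.0.0.1", 80), SimpleHTTPRequestHandler).serve_forever()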

Storage on enterprise-class computers or supercomputers


I have never even seen an enterprise-class computer or supercomputer in my life, not to mention getting to actually use one. The closest I have come is getting to see and use a Sun E450 workgroup server in a former job. All the advertising and specifications I have seen of such enterprise-class computers or supercomputers on the Internet only talk about CPU power and memory, not about storage. What do these computers use for storage? Do they have internal drives at all, or do they use separate storage servers? JIP | Talk 18:47, 12 May 2014 (UTC)[reply]

Machines intended solely for computation or pumping data with modest transformation (blade servers, perhaps web servers) might have no disk at all - they might boot off a network boot server and work only with RAM (of which they might have a large amount). Others are pretty much headless workstations, with a single hard drive (or perhaps an SSD and an enterprise-grade HD). An enterprise-grade HD is usually a 10,000 or maybe 15,000 RPM SCSI device. Beyond that, machines can be directly attached to a storage array (a separate device with a bunch of disks in it, which might be configured as a RAID, ZFS, or JBOD); as some applications require high failure-tolerance, you might get several server machines cross-wired with a pair (or more) of storage arrays, where the controllers of each monitor their twin, allowing one controller to take over the work of its twin should it fail. Interconnects for this might be iSCSI, Fibre Channel, or InfiniBand. Beyond that, storage may be organised into a Storage area network (a fabric) consisting of a number of storage arrays interconnected (again, often with redundancy) with switches connecting to the Host Bus Adapters in the client machines. As one moves up the complexity spectrum (from locally attached storage to a fabric) things become more and more virtualised; the storage system implements a number of volumes (each consisting of lots of logical blocks), but it's increasingly unclear to a client machine where that volume actually is (it can be distributed across several disks in several arrays, and be delivered redundantly over several different connections via different paths). All of this SAN stuff works like virtual physical devices - the SAN supplies volumes and logical blocks, and clients build filesystems on those (much as they would on a local, single SCSI disk). The (well, an) other way to do things is the opposite - for the storage device to implement a file system and export it via a network filesystem protocol (NFS, CIFS (Samba)), making it a Network-attached storage solution - a whacking great file server (although again, in practice these are built from smaller, redundantly configured and connected components which they virtualise to give the illusion of a single system). Perhaps the biggest player in this space is NetApp. Enterprise storage equipment is characterised by a greater emphasis on reliability, on hot-swappable components, on a higher degree of internal monitoring, on the ability to configure redundant setups for failover, and on integration with enterprise monitoring and configuration tools (so someone can monitor, configure and manage a farm of storage devices from a single screen). All of this redundancy and reliability can mean that things get very pricey (if it wasn't for that, someone might be tempted to say to themselves "hey, I could replace this insanely expensive NetApp thing with a homemade Linux box running Samba, a couple of RAID cards, and a chassis full of Barracudas") - note the newer expansion of the RAID acronym vs the old one (smiley). Very, very large storage providers (e.g. Google Storage, Amazon S3) work differently - they seem to throw lots of cheap hardware at the problem, relying on even more redundancy (the big players tend not to be so open about their architecture, as that's their value-add); S3's cousin Amazon Glacier appears to be hosted, at least in part, on tape. -- Finlay McWalterTalk 20:53, 12 May 2014 (UTC)[reply]
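To put a rough number on the capacity cost of that redundancy, here is a back-of-the-envelope sketch in Python. It is purely illustrative: it assumes identical disks and ignores hot spares, filesystem overhead and vendor-specific layouts.

    # Approximate usable capacity for some of the layouts mentioned above,
    # given n identical disks of disk_tb terabytes each (illustrative only).
    def usable_tb(layout: str, n: int, disk_tb: float) -> float:
        if layout == "JBOD":     # no redundancy: every disk fully usable
            return n * disk_tb
        if layout == "RAID 1":   # mirrored pairs: half the raw capacity
            return n * disk_tb / 2
        if layout == "RAID 5":   # one disk's worth of parity, survives 1 failure
            return (n - 1) * disk_tb
        if layout == "RAID 6":   # two disks' worth of parity (cf. ZFS raidz2)
            return (n - 2) * disk_tb
        raise ValueError(f"unknown layout: {layout}")

    for layout in ("JBOD", "RAID 1", "RAID 5", "RAID 6"):
        print(f"{layout}: {usable_tb(layout, 12, 4.0):.0f} TB usable from 48 TB raw")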
Minor nitpick, but my understanding is there's great debate about whether Glacier uses tape, with some suggesting it may simply be normally disconnected HDs [1]; special low-speed, low-power HDs; or optical media [2] [3] (the last mentions both).
That said, from what I've read I also wonder if a lot of the 'it's not tape' crowd are reading way too much into minor comments from Amazon personnel. In particular, I don't see how Amazon telling people they should see Glacier as a replacement for tape, and that it uses inexpensive commodity hardware components, definitely means they aren't actually using tape. (Actually, these comments would seem to argue more against the idea of weird hard disks or any optical media besides standard ones, which admittedly may include BDXL.) Other issues like climate-control requirements and Amazon's durability guarantees would seem to depend more on redundancy and on Amazon's plans for long-term storage (which could easily be 'rewrite the data every X years', perhaps with different Xs for different copies).
Nil Einne (talk) 18:56, 13 May 2014 (UTC)[reply]

I know this sounds like a strange question but does anyone know how I can set up MS Word 2010 to do numbering like this File:Numbering document underlined first word.jpg so the underlined number links up with the underlining of the first word? Thanks Amisom (talk) 20:03, 12 May 2014 (UTC)[reply]

Go to Options > Advanced, and locate Compatibility Options (at the bottom of the list). There should be an option to "Underline tab character in numbered lists." Once that option is selected, you should find that underlining both the number and the first word of the list item causes the space between them to be underlined also. This should work fine in Word 2010. (For Word 2013, you would have to use an older version of the Word file format in order to support this, I think.) —Noiratsi (talk) 09:14, 13 May 2014 (UTC)[reply]

How to bring back tabs and URL


At a library I discovered something that I have rarely seen before. The tabs at the top of the page and even the space where the URL goes just disappear. I asked for help and was told to move the mouse to where I want these to appear, and how long it takes for them to appear varies. It's very annoying. I hope whoever can fix this will be back the next time I go to that library.— Vchimpanzee · talk · contributions · 21:29, 12 May 2014 (UTC)[reply]

It's not broken. What it's for is to allow more viewable space on the screen. My tablet's browsers all do this, and as a matter of taste, I prefer it. However, I'm surprised that the library you use has this enabled, as it is not user-friendly for inexperienced computer users. As a public librarian myself, anything that confuses them is a very bad (and time-consuming) thing. Any idea what browser you were using? Mingmingla (talk) 22:11, 12 May 2014 (UTC)[reply]
Long side thread about Microsoft even though the question was about Firefox
I was helping a neighbour with his Windows 7 machine, and that little "feature" in (ugh) Internet Explorer almost drove me to break it. Hovering didn't help, clicking didn't help, only shortcut keys (the ones I remembered, anyway, and none of the Menus, just actions). I'm a fairly experienced computer user, but I have no idea what Microsoft is trying to accomplish this last decade or so. Convinced my neighbour to at least get Firefox. I suggest you get a better library. InedibleHulk (talk) 23:47, 12 May 2014 (UTC)[reply]
If you need a refresher on all that Microsoft accomplished in the last ten years, here's a list of important dates provided by their corporate press office. Ten years ago, they began returning 75 billion dollars to shareholders; they launched a series of very popular video game consoles; shipped several major revisions of their operating systems that make billions of personal and embedded computers work; acquired Skype and Nokia; and created a few hardware platforms, like the Surface and the Xbox; and ported Microsoft Office to work on the iPad. The corporation also smoothly transitioned beyond its founder, who stepped out of the inner circle to focus on operating one of the world's largest charity programs, the Bill and Melinda Gates Foundation. Not everything Microsoft have done has worked flawlessly, but before you nitpick them, you might want to reconsider what you have accomplished in the last ten years. Nimur (talk) 00:24, 13 May 2014 (UTC)[reply]
If you knew me, you'd know exactly how unimpressive any of that is. Probably as impressed as you'd be if I did tell you about my days of glory. Didn't mean to touch a nerve, Microsoft is good at what they do. But making a product I want to use generally hasn't been what they do (Hotmail's alright). I'm glad they could help you. InedibleHulk (talk) 04:54, 13 May 2014 (UTC) [reply]
I'm with Hulk on this one. Let's look at what they actually accomplished.
  • they launched a series of very popular video game consoles: whatever; it's the worst of the Big Three. It lacks both the raw power of the PS3 and the innovation of the Wii. It's a dumbed-down PC By Any Other Name™.
  • shipped several major revisions of their operating systems that make billions of personal and embedded computers work: With the one major feature of being "new" rather than better. "Lack of innovation" would have been a good thing this time.
  • acquired Skype and Nokia: Who. Cares. As the joke goes, "they could have downloaded Skype for free".
  • ported Microsoft Office to work on the iPad: if they ported to MacOS, that would have been worth writing home about, but an office suite on a tablet is like porting a microwave app to a fridge.
  • Not everything Microsoft have done has worked flawlessly: [citation needed]
  • but before you nitpick them, you might want to reconsider what you have accomplished in the last ten years: I'm pretty sure any Wikipedian would have a hard time trying to waste as much usertime as Microsoft did with Vista.
That's my $0.02 on Microsoft's recent exploits... - ¡Ouch! (hurt me / more pain) 07:03, 13 May 2014 (UTC)[reply]
Um? Microsoft Office for Mac has existed since 1989 (see Microsoft Office#Mac and iOS versions), which is before the Windows version was released; the current version was released about 3.5 years ago and the next version is due sometime this year [4]. (Even Microsoft Word for Mac was launched only about 2 years after Word was first released.) Office for Mac is sometimes criticised for various reasons, most particularly compatibility problems between documents saved in the Mac and Windows versions, the (possibly historical) poor support for complex scripts, and the lack of Microsoft Access. And there's a slight possibility Microsoft might have abandoned Office for Mac in the dark days of Apple were it not for antitrust concerns, although some commentators disagree [5]. Either way, your comment above makes little sense. Is the iPad version even derived from the Windows version, or is it really derived from the OS X version?
As for the PS3 vs the 360, it's well recognised that many game developers hate/d the PS3 and the Cell (microprocessor) due to the difficulty of coding for it. It's little use having a more powerful processor if few people know, or want to know, how to use it, to the extent that you rarely get an advantage. (The PS3 did have the advantage of having a Blu-ray drive and the larger capacity that this resulted in, but this isn't really a raw performance advantage.) It's intended to be a game console after all, not a research device.
Sure, things may have improved somewhat near the end, and perhaps a few games gained an advantage from the greater theoretical performance, but it doesn't seem to have been many. I suspect Sony wasn't helped by the fact that there's probably less in-house engine development (which is a double-edged sword: on the one hand it means developers don't have to worry so much about the oddities of the Cell, but on the other hand a cross-platform, multipurpose engine is less likely to do weird stuff using the unique features of the specific consoles it's being targeted at, based on the specific needs of the game); a lot less exclusivity, i.e. most titles are at least developed with the assumption they may become cross-platform (even if it's only PCs running Windows) at some stage; and overall less drive to really push the envelope using weird stuff than there may have been with past consoles, when developers felt they were significantly limited by the hardware or could do something really cool. (And I'm not completely sure how much the SPEs really helped with the biggest problem for both, namely the limited amount of RAM, which became increasingly evident.)
The biggest sign of this is the fact that both Microsoft and Sony chose something very like a PC for this generation, including an x86 processor (as, ironically, the original Xbox had). Actually, the PC I'm using right now, which is connected to a TV and running Windows 8, is somewhat similar, although it has a Trinity, so the CPU may or may not be more powerful but the GPU is much less powerful, and it also lacks HSA. (Memory bandwidth is similar to the Xbox One. It could probably be higher, but the APU and/or motherboard has problems running the memory at its rated speed.)
Of course, all this depends on what you mean by a PC anyway. When most people in the developed world think of a PC, they probably think of an x86 device, probably running Windows or perhaps *nix. Macs, even now that they are very similar to PCs in terms of hardware, often aren't considered PCs, probably because of Apple's successful marketing. Despite the name, very few people would have had much experience with something using a PowerPC CPU which they would call a PC by the time the 360 came around (which was around the time Apple was transitioning to x86). Not to mention the Xenon (processor) was, as I understand it, derived from the Power Processing Element of the Cell, which used the Power Architecture but is distinct from other PowerPCs and not used anywhere besides the Xenon and Cell processors AFAIK. Meanwhile there were, I believe, some Cell-based workstations which could be considered PCs (although more likely, people with workstations would simply have had a Cell accelerator card, i.e. with a distinct CPU, if they had any Cell in their workstation).
In other words, in many ways the PS3 is as much a PC as the 360 was. And if anything, you could say the PS3 is more like some PCs than the 360.
So again this is a fairly odd statement. If anything, the current generation, PS4 vs Xbox One, seems a better example: while both chose something that's very like a PC, the PS4 is clearly more powerful on the GPU side, thanks particularly to the higher memory bandwidth and more compute units (and perhaps in the future to the more async compute units). The Xbox One does have eSRAM, but this doesn't generally seem to help anywhere near enough (possibly because there isn't enough of it and they couldn't fit more [6] [http://attackofthefanboy.com/news/xbox-esram-small-1080p-microsoft/]), and the minor CPU advantage likewise would generally not be significant. And early indications definitely are that the PS4 is winning where it matters, i.e. among consumers. (You could probably say the Xbox One is slightly more like a PC than the PS4, since GDDR5 as system memory will probably never be possible for anything besides a highly specialised system like the PS4. Although realistically it's likely that DDR4 combined with triple channel may mean you could achieve about equivalent memory bandwidth on a more normal APU in a year or two, if AMD stick with it that long and are able to get the memory controllers up to scratch (although I'm not that sure how good the PS4's one is anyway). And I have doubts AMD is going to do anything like eSRAM on their main APU lineups.)
This is unlike the last generation, where there was no clear winner [7]. Both Microsoft and Sony had various problems, e.g. Sony with supply and cost issues due to their more expensive system and of course the PlayStation Network outage, Microsoft with the Red Ring of Death. Even Nintendo's early success with the Wii seems to have later started to falter. In particular, while they did end up selling more units, they also, I believe, sold fewer games per unit, which is of course significant since games tend to be where they make most of their money. And it's looking like they're even worse off than Microsoft in this generation, although they are somewhat out of sync anyway. (Of course, Nintendo probably isn't helped by the fact that a key part of their target market has smartphone and tablet games, and perhaps those Facebook ones, to keep them entertained. And both Sony and Microsoft copied their key innovations in their own way, perhaps without that much added success, but with enough to mean they are no longer so appealing.)
In fact, Microsoft's attempts at innovation, such as the always-bundled Kinect, possibly including family game sharing (or at least the earlier plans where it was compulsory, with the associated requirements such as regular online checks and games being associated with an account rather than the media [8]), and possibly including eSRAM (although as hinted above, it may also just be the early state of the tools, and it's perhaps also likely the Xbox would be a lot worse without the eSRAM), have mostly been abandoned or otherwise been unsuccessful. In many ways, Sony stuck to the basics and so far seems to be having success; Microsoft tried something different and mostly seems to have failed, at least for the moment. (Of course it's still early days, too early to definitely declare any winner or loser.)
It's of course rather funny that this thread seems to have started from criticism of a feature which is in most browsers. In fact, I've personally encountered it in Chrome and Firefox much more than in IE, which I rarely use. And while it is slightly annoying, it is easy to resolve if you know the F11 shortcut key, which I suspect is how it was activated anyway. Even funnier, if I understand the comment below, the OP wasn't even using IE but Firefox. Was this feature even introduced first in IE? I can't really recall (although I do know it greatly preceded the iPhone and the full-sized touchscreen smartphone era).
P.S. While Vista has its flaws, it very likely saved me more time than it wasted compared to XP.
Nil Einne (talk) 16:15, 13 May 2014 (UTC)[reply]
It sounds like the Full Screen feature. You should be able to find it in the settings. Where in the settings will depend on which version of which browser your library uses. Dismas|(talk) 00:17, 13 May 2014 (UTC)[reply]
If Dismas is right and this is just the browser's ordinary fullscreen mode, most browsers use F11 to turn it on and off. Saves having to dig through the settings menus. —Noiratsi (talk) 07:20, 13 May 2014 (UTC)[reply]
Thank you, Noiratsi. It worked. I had dealt with this before years ago and the Computing Reference Desk gave me the solution but I didn't have any idea how to look for it. I forgot to mention Mozilla Firefox. And no, I'm not getting a better library. I can walk to this one. I go to other libraries but not daily.— Vchimpanzee · talk · contributions · 12:53, 13 May 2014 (UTC)[reply]