Windows PowerShell in Action

jlcopeland writes "For two decades I've hated the command prompt in DOS and Windows. Inconsistencies abound and everything is a special case. The fallback on a Microsoft box has been running a Unix shell under Cygwin, installing Microsoft's own Services for Unix (or its predecessor, Softway's Interix), or scripting in Perl, but those only get you so far. Having co-written nine years' worth of trade rag columns using mostly Perl as the implementation language for the samples, and thinking of every problem that comes across my desk as an excuse to write a little bit of scripting code, I've got some well-formed views about scripting languages and what works and what doesn't. That means I've been eagerly watching the development of PowerShell since it was called Monad. It's got the advantage of being a unified command-line interface and scripting language for Windows, even if it does have a dorky name." Read the rest of Jeffrey's review.
Windows PowerShell in Action
author: Bruce Payette
pages: 576
publisher: Manning
rating: 9
reviewer: Jeffrey Copeland
ISBN: 1932394907
summary: Guide to PowerShell, the new Windows scripting language


Bruce Payette's Windows PowerShell in Action is a great overview of PowerShell, aimed at an audience that's got some experience with other scripting languages. Bruce's book is a big improvement over Andy Oakley's earlier book, Monad, which I had been using: it's more complete and it's up-to-date for the first release of PowerShell. It's got great (and sometimes amusing) examples, and feels like the Perl Camel book in flow. When I was reading it in the gym or someplace else away from the keyboard, I kept wanting to run back to the office to try something out. There are also useful "why it works this way" digressions, which provide a lot of context. Since Bruce was on the original development team, wrote most of the commandlets, and was responsible for much of the language design, those digressions are more authoritative than the directors' commentary tracks on most DVDs.

In outline, the nine chapters in the first part of the book build up as you'd expect: overview and concepts, to data types, to operators, to regular expressions, to syntax, to functions, to interpreting errors. It covers that ground better than many language books that now litter my shelves. The explanations are clear, and the examples are almost all exactly on point. It took me a second reading to realize that my complaints about the regular expression sub-chapter weren't about the chapter itself, but about some of the implementation decisions; that's an argument about style more than substance, and an observation about me, not about Bruce's writing or PowerShell. The first part of the book is the "mandatory reading," if you will, to get the language down and begin exploring on your own.

The second part is where the real applications are covered. That's the part that you especially want to read sitting next to the keyboard. As you'd expect, the example code is available from the publisher's web site to start you off — look for "Example Code" under "Resources." There's a very good discussion of text processing and how to handle XML, complete with some not-obvious warnings about traps to avoid. I've been working very carefully through the really good chapter on using GUIs with PowerShell, "Getting Fancy — .NET and WinForms," and my own proof of concept for that has been rebuilding an old C++ data entry application into a much simpler PowerShell script. As a nice side effect, Bruce's book (and the WinForms chapter in particular) provides a gentle overview of some concepts in the .NET framework, which I hadn't had an opportunity to delve into. The appendix on using PowerShell as a management application will be especially useful to system managers; that was one of the original PowerShell target audiences, and the language achieved that goal very well. The appendix on the language's grammar is really useful, and I keep flipping back to it to check on things.

After Oakley's Monad appeared, there was a long gap before the next PowerShell book appeared. Bruce's book looks to be the first of the post-release wave. If all it had going for it was the authoritative pedigree of the writer, it might be worth it, but it's also well-written, well-organized, and thorough, which I think makes it invaluable as both a learning tool and a reference.


You can purchase Windows PowerShell in Action from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

Comments Filter:
  • PowerShell (Score:2, Interesting)

    by El Lobo ( 994537 ) on Wednesday May 02, 2007 @02:59PM (#18960853)
    is one of the nicest things that has come out of Redmond. Ever. My only complaint is that it's tightly integrated with .NET, but OTOH I believe this is necessary to integrate the always-nice C# into it, which is of course a plus.... You can't have your cake and eat it too...
  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Wednesday May 02, 2007 @03:03PM (#18960913) Homepage

    PowerShell is very powerful - much more so than cmd.exe. I don't have enough experience with bash to compare the two, but I would not be surprised to learn PowerShell equals, if not beats, bash at the shell game. I wouldn't say it's ready to replace any of the scripting languages just yet, though.

    I have been using it for a while now, and the single (semi-major) problem I can find is memory usage. It's a hog at best, and at worst, when you're using it semi-heavily, it can easily chew up 1GB of memory. That's even with giving the GC something to work with, i.e. unsetting $vars when you're done with their data.
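    One way to "give the GC something to work with," as a sketch (the variable name is hypothetical, and how much memory actually comes back depends on the CLR):

```powershell
# Sketch: drop the reference to a large pipeline result when done with it.
$big = Get-ChildItem C:\Windows -Recurse -ErrorAction SilentlyContinue
# ... work with $big ...
Remove-Variable big    # unset the variable so the objects become collectable
[GC]::Collect()        # optionally request a collection right away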

  • by Brunellus ( 875635 ) on Wednesday May 02, 2007 @03:05PM (#18960933) Homepage

    I wish they'd kept "monad" as the name. It was a deft tip of the hat to Leibniz's Monadologie, which held that monads were the windowless metaphysical atoms of perception itself.

  • Great Review (Score:5, Interesting)

    by water-and-sewer ( 612923 ) on Wednesday May 02, 2007 @03:14PM (#18961047) Homepage
    I don't use Monad (:s/M/G/g) or intend to, so I don't care much about the book. But what a great review. We get a lot of amateur reviews around here, but this one was particularly well written, clear, and informative. Nice job, homie.
  • by raphae ( 754310 ) on Wednesday May 02, 2007 @03:20PM (#18961127)

    a Unix shell under Cygwin or [...] but those only get you so far



    What is wrong with Cygwin? How can he bash Cygwin (sorry, no pun intended) without even bothering to say anything about it?
  • by 3m_w018 ( 1002627 ) on Wednesday May 02, 2007 @03:31PM (#18961275) Homepage
    It's slower than cold molasses up a hill.

    It takes a few seconds for the prompt to appear, and if I run a "dir" in a directory with hundreds of files with both cmd.exe and PS, cmd.exe beats it by seconds.

    I'm not running a slow machine (Core 2 Duo, 1GB of RAM). Is there something that needs to be configured to make it suck less?
  • by harry666t ( 1062422 ) <harry666t@nospAM.gmail.com> on Wednesday May 02, 2007 @03:44PM (#18961521)
    harry@satan:~$ apt-cache search powershell
    powershell - powerful terminal emulator for GNOME
  • Re:At this rate... (Score:5, Interesting)

    by archen ( 447353 ) on Wednesday May 02, 2007 @03:54PM (#18961733)
    There's actually a lot of other cool things about it as well. I've been using JScript to do stuff since Win2k (screw that VBScript garbage), but there are obvious limitations at the scripting level, and in the end you're always stuck in cmd.exe.

    I was skeptical when I first heard about Monad. I mean, it seemed obvious to me that Microsoft just didn't get the point of a "shell", which is supposed to be simple. With a pending install of Exchange 2007, PowerShell is required, so I figured I'd start to dig into it. I have to say I'm rather impressed.

    First of all, it is actually simple. Not only that, but the syntax is EXTREMELY CONSISTENT. And honestly I cannot stress that enough, because if you think you know part of a command you can usually figure out the verb/noun syntax to use. It also allows shortcut versions of commands so you don't have to type the entire "wordy" version of the command. Aliases are supported too. Another cool feature? You can navigate the registry like the rest of the file hierarchy.
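    Registry navigation uses the same provider model as the filesystem; a quick sketch (the HKLM: drive is built in, the key path is just an example):

```powershell
# Treat the registry like a directory tree via the HKLM: provider drive.
Set-Location HKLM:\SOFTWARE\Microsoft
Get-ChildItem          # lists subkeys the way dir lists subdirectories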

    I'm a big fan of bash, but I must admit that at times it gets old shuffling stuff with awk and cut and so forth. By getting objects you can take what you want out of it, and not worry about the biggest Unix terror - the text output changing. If whatever you're trying to do doesn't support .net objects, you can still pipe text.

    Overall it's pretty awesome technology and I must give MS credit where it's due. Not that I'll be switching any of my FreeBSD servers to Windows because of it, but it makes windows administration orders of magnitude better. Too bad it got dumped in Vista. I've heard it will be included in service pack 1 though.
  • by mangu ( 126918 ) on Wednesday May 02, 2007 @03:55PM (#18961759)
    it treats piped arguments as objects instead of strings, Psh lets programs access the data directly, instead of having to manipulate large amounts of textual data with tools such as grep or PERL.


    Then they have absolutely no idea about what they are doing. The one big advantage in using pipes is that any application can handle text data.


    Let me give one example: I use the sort command all the time, it sorts data by lines of text, lines are compared according to criteria passed in command options.


    Now, imagine if it depended on binary objects. For every sort operation one would have to write a comparison function to decide which object should come before the other. Writing a special function would mean declaring some form of callback, or maybe some people would call it a closure, whatever. And so on.


    Here's one simple command I use when a disk starts getting full to see which directory is the data hog:
    du -x / | sort -nr > mem.txt &
    What this command does is check the disk usage (du command) in the root directory (/) without looking at symbolic links to other disks (-x option). The result is piped to the sort function, where it's sorted by the numeric value of the first column in reverse order (-nr option). The sorted result is sent to a file named mem.txt. Since checking the whole disk may take some time, it's done in the background (& command). After it finishes, I have a file with the size of each directory in the disk, one line per directory, ordered by size, larger directories first.


    See how powerful it is, having data represented as text? Try writing this line as a Powershell script.


    Another advantage of having data in text format is that you can test it using the keyboard and screen very easily, no need to run a special debugger.


    Instead of trying to reinvent Unix poorly, Microsoft would do a favor to its customers if they accepted Unix is a mighty fine OS and adopted without shame its best features.

  • by sootman ( 158191 ) on Wednesday May 02, 2007 @04:04PM (#18961907) Homepage Journal
    For those who have never seen Monad in action and are quick to bash (ha! get it?) Microsoft's new shell, take a look at these two [msdn.com] videos [msdn.com]. You'll see that it's much more than just bash on Win32. In fact, if it ever catches on, it'll be Unix's turn to play catch-up, because some parts of it are pretty damn amazing. (Note that those are from 2005, but AFAIK, there haven't been substantial changes.)

    The whole point of Monad is that it's not just text, it's objects. So, unlike Unix, where you work with a command and then filter its output (which is just text), the output of Monad, while looking like text, is actually kind of like pointers back to the real thing, so instead of just doing a Unix-style command | filter | filter, you can say "OK, run this command, now of the things it output, go back and tell me this and this about them." Like, "Of these things that are running, tell me which five are using the most CPU time, then tell me the version of each, and how much memory they're using."
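    That last query really is close to a one-liner; a sketch using standard cmdlets (the exact property set, such as FileVersion and WS, may vary by version):

```powershell
# Top five processes by accumulated CPU time, with version and memory use.
Get-Process |
    Sort-Object CPU -Descending |
    Select-Object -First 5 Name, CPU, FileVersion, WS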

    PS: "...even if it does have a dorky name"--which name were you referring to: the one that sounds like 'testicle' or the one that makes me think of the Lottery? :-)
  • by Evilged ( 1096629 ) on Wednesday May 02, 2007 @04:11PM (#18962035)
    I have to manage a Windows domain of 1,000+ users, and PowerShell (yes, crap name) lets me do, quickly and in a few lines, stuff you used to only be able to do easily with expensive third-party tools.
    The only other choice for doing something similar on Windows is VBScript, which I find painful at best. It may not be the best shell ever, but at the moment it's the best Windows-integrated shell with access to .NET, WMI, etc., and it beats Active Directory administration through a GUI any day.
    The book is great, by the way, and his blog has loads of useful tips too.
    Anybody who is actually going to try it will find the PowerShell Community Extensions very useful. G
  • by Brit_in_the_USA ( 936704 ) on Wednesday May 02, 2007 @04:16PM (#18962159)
    I hope this is not a dupe - I certainly was not aware....
    ...PowerShell is available for MS OSes older than Vista too:

    http://www.microsoft.com/windowsserver2003/technol ogies/management/powershell/download.mspx [microsoft.com]
  • by Bloody Templar ( 702441 ) on Wednesday May 02, 2007 @04:18PM (#18962193)
    That's not an apples-to-apples comparison, though.
     
    In cmd.exe, "dir" is "dir." It gives you a text listing of the files and subdirectories of your current directory. In PowerShell, "dir" is an alias for "get-childitem," which returns an object array that can either be parsed for display in the console, or passed down the pipeline to another commandlet.
  • by MS-06FZ ( 832329 ) on Wednesday May 02, 2007 @04:38PM (#18962529) Homepage Journal

    has microsoft submitted to linux and unix? we have had a "power shell" for a few decades now..
    You sure about that? I think we have crusty old 1970s shells with a veneer of tab-completion and command history added for convenience. I would really like to see that situation change. I like the CLI and I think it's a powerful way to work - if the shell is up to the task.

    CLI shells ought to be able to, for instance, access the GUI (if any), as well as interact with running applications. These tasks can technically be done with just about any shell - but only in the sense that a program that runs under the shell can do some task and report back information to the shell environment - and the ability to do that is limited by what data types the shell can handle. Bash can handle one datatype, basically - text. It can't handle structured data, it doesn't really support binary data or numbers, let alone live objects you could interact with.

    I think this is a source of a lot of busywork in traditional shells - you run a program that, say, prints out numeric data from a matrix file, then to process that data the next program in the chain needs to parse the overall output, convert the numbers back to binary, and then probably re-format and print them out as text again. It makes no sense.

    I really, really want the Linux CLI to be modernized. (Lots of time is spent working on the Linux GUI, but it seems like when it comes to the CLI people are content to let it rot.) I've spent a lot of time trying to figure out how I would go about doing that. I've read a bit about PowerShell - it seems interesting, at least, and promising. For instance:
    - It can wrangle live .NET objects and complex datatypes
    - It encourages a unified interface (conventions for command options, etc.) for CLI programs and utilities
    - It applies these new techniques in conjunction with existing, traditional shell mechanisms, like pipes.
    - It endeavors to make documentation and general information about commands easier to access

    Now, there's also things I don't like so much - for instance, the distinction between "commandlets" and normal commands. (To be fair, this is largely due to the fact that most existing code in the world is written either for a traditional CLI or a GUI - so most code isn't going to know how to deal with a smart CLI anyway. But I think there are better solutions.)

    I think it's kind of a drag that Microsoft may now have a better CLI than Linux - but I think that's a situation that can be changed.
  • Re:Prompt (Score:2, Interesting)

    by Evilged ( 1096629 ) on Wednesday May 02, 2007 @04:39PM (#18962549)
    Yes you can, you set it in your Powershell profile, look for the function prompt.
    [datetime]::now.ToLongTimeString() gives the current time, btw.
    G
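    A minimal version of such a prompt function for the profile, as a sketch (the exact formatting is just one choice):

```powershell
# Sketch: a profile prompt that shows the current time before the path.
function prompt {
    "[" + [datetime]::Now.ToLongTimeString() + "] PS " + (Get-Location) + "> "
}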
  • Bean Shell (Score:4, Interesting)

    by hachete ( 473378 ) on Wednesday May 02, 2007 @05:18PM (#18963271) Homepage Journal
    The Bean Shell. That's it. That's what Gonad is trying to copy. Though I forget - who's trying to copy who this week?

    Same problems, too - memory consumption up the wazoo and slow as hell. Every time you've got to do a "pipe" you need to look at miles and miles of API.
  • Re:At this rate... (Score:3, Interesting)

    by Tanktalus ( 794810 ) on Wednesday May 02, 2007 @05:22PM (#18963357) Journal

    Why not just use "rename"?

    rename .ext1 .ext2 *.ext1
    I'm actually thinking of something just a bit more complex:

    for file in *.avi; do ffmpeg -i $file -target dvd -aspect 16:9 ${file%avi}mpg; done
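    The ${file%avi}mpg expansion in that loop strips the trailing "avi" and appends "mpg". The same loop with mv standing in for ffmpeg, so it runs anywhere (the clip names are made up):

```shell
# Demonstrate the ${var%pattern} suffix-strip expansion from the loop above.
tmp=$(mktemp -d)
cd "$tmp"
touch clip1.avi clip2.avi
for file in *.avi; do
    mv "$file" "${file%avi}mpg"   # clip1.avi -> clip1.mpg
done
ls                                # clip1.mpg  clip2.mpg
```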
    Not that I have a use for something like that, mind you...
  • by Anonymous Coward on Wednesday May 02, 2007 @06:02PM (#18964073)
    Probably painfully slow. I don't know enough scripting to actually benchmark it ... but on C:\windows\system32 (> 2k files) "dir" via CMD finished in well under two seconds. After two seconds, PowerShell was barely up to the letter C.

    But, as another poster mentioned, with the performance hit comes the ability to treat each item returned by dir as a .NET object. You just can't do that with CMD.

    Still, if all you need to do is list a directory's contents then you don't need PowerShell. With PS I can -- from the command-line -- manage remote computers via WMI (and let me tell you, the occasional ability to troubleshoot remotely without firing up VNC is stellar), with some custom scripts I can manage Active Directory (having to simply type 'add-employee "Joe Sixpack"' is ... well, stellar). It's a very powerful tool at only a 1.0 implementation, so it'll only get better... /Posting anonymously not because I'm an MS shill, but because I don't want to sacrifice my mod points on the article.
  • by MS-06FZ ( 832329 ) on Wednesday May 02, 2007 @06:04PM (#18964119) Homepage Journal

    Text is universal, though, and it lets you have a single means of output for simpler programs. Want to read the results immediately? Just call the program. Want to save the results? Dump them to a file to review later.
    Text is only universal because we've made it universal. This is the same idea behind XML - it's mainly useful because it's recognized. And text isn't "human-readable" - it's binary just like everything else. It just happens that the process of displaying it is rather simple and the programs to do that are already out there.

    If you look at Powershell, they've got the "read immediately" process down - commands like "Format-Table" come to mind. Yeah, big deal, right? It's what PERL was born to do. But nevertheless - if you run a command and it generates an object, a meaningful printed representation of that data appears on the console. If you want it to look nicer, there are commands to format the output. I think the shell would be better if the user didn't need to handle that step explicitly - I have some ideas for how that could be done. (I am very interested in writing a Linux shell with these kinds of capabilities.)

    This makes it trivially easy for programmers to modify the output, or for the users to use it in unexpected ways. It means we don't need a separate program to convert binary data to a human-readable form first.
    From my perspective you've got it backwards. Every single program in a pipeline chain is burdened with the job of converting "human-readable" data to a useful, processable form - and then back to text again for the next chump in the line. So maybe you save one step, because you don't need to reformat the output of your last command - but you've added on two steps for every command in the chain (minus one, for the first) - and when programs start to get even moderately outside the realm of the common, everyday stuff, the user starts having to deal with those processes themselves.

    Complex datatypes in a shell are only good if you're using a set of languages that can deal with the same objects. With Unix, not all your languages have a concept such as an object -- not even a struct. Even then, you need a human-readable form for them, even if they're converted at the user's request.

    As long as Monad has that, it's probably decent. But that's going to be application-specific.
    True, that is a problem. Not so much the languages' limitations in handling objects - that's just syntax. You can write object-oriented code in C if you feel like it, and certainly C can interface with object managers like CORBA - it just wouldn't be especially fun. It's more the problem of having a common communication format - that's a hard sell because people don't think they need it, and it's a bit of a tall order to mandate something like that.

    In what sense is Monad's pipeline communication format "application-specific"? It can deal with any .NET object. If you had an object with a field called "Title" that had a value "Foozle", you could access that any way you like. It's not application-specific. You want a human-readable form? "Title : Foozle" pretty much does it. :)
  • by Mia'cova ( 691309 ) on Wednesday May 02, 2007 @08:53PM (#18965955)
    The default security blocks them. To run unsigned stuff, you have to set it to accept those scripts. Just a one line command you'll see at the top of almost any blog/tutorial posting on powershell, nothing painful.
  • by DragonWriter ( 970822 ) on Wednesday May 02, 2007 @10:57PM (#18967191)

    Now, there's also things I don't like so much - for instance, the distinction between "commandlets" and normal commands. (To be fair, this is largely due to the fact that most existing code in the world is written either for a traditional CLI or a GUI - so most code isn't going to know how to deal with a smart CLI anyway. But I think there are better solutions.)


    It's not so much dealing with a "smart CLI" (that the interface is a command-line one is irrelevant) as dealing with "returning objects to an OO platform". Programs not designed to run on an OO platform (like anything designed to run on Unix or Windows but not something like .NET or the JVM) clearly aren't going to be designed to do this, since they have no such platform to return objects to. And most .NET or JVM tools aren't designed to do that, because even though Java shells and PowerShell exist now, they haven't for much of the time that the platforms have existed, and aren't the usual way most programmers expect their programs to be used.

    OO shells running on OO platforms will enable new ways of chaining programs together (not just from CLIs, either).

    I think it's kind of a drag that Microsoft may now have a better CLI than Linux - but I think that's a situation that can be changed.


    Well, sure, the .NET platform has a CLI that allows you to do OO things. So does the JVM. Linux doesn't, because Linux isn't an OO platform, and while OO platforms exist that run on Linux, none is as central to the OS as .NET is becoming for Windows. OTOH, a "smart CLI" doesn't seem to get you much that an OO scripting language that supports interactive sessions combined with a conventional object serialization format would seem to enable even without an OO platform (and both have similar limitations with regard to legacy programs), so I don't know if Linux needs to have something like .NET to match PowerShell.
  • by mnmlst ( 599134 ) on Wednesday May 02, 2007 @10:59PM (#18967207) Homepage Journal

    Thanks to some poor choices in my younger days, I have become a full-blown Microserf herding along 250 Windoze servers, half of them in remote locations. If I had it to do all over again, I would have taken the red pill. This may offend the *nix snobs here, but if MS gets really serious about MSH (the way I keep seeing it when running PowerShell), it will be awesome. I haven't seen anybody here mention that it is built-in with Exchange 2007 and when you run through an E2K7 wizard, the last step is the display of the MSH script that will execute once you click the Finish button. It's also just waiting for you to copy and paste that script before clicking the Finish button, so you can expand it and reuse it later.

    My boss is such a Windoze junkie, he pooh-poohs my scripting efforts at every turn. We often spend hours and hours doing repetitive crap in the GUIs because "we don't have time to work out a script now!" I have avoided getting really deep into cmd.exe and VBScript approaches ever since I first read about Monad during the betas, as that crap should be passing away. I've been bursting at the seams for some good books to come out.

    Beware a first effort from MS. If they get serious, the third version will be quite good. In the meantime, a wise sage told me to expect third-party vendors to jump on this bandwagon and cook up gobs of stuff to leverage PowerShell to save Win Sysadmins keyboard time with canned scripts. That would leave me sucking garbage in the MS Matrix with the rest of the Duracells, but fortunately my boss won't spend any money on decent tools, so I will get to hack out the scripts by hand and really learn MSH. Awesome.

    If you're a Win Sysadmin reading this, be sure to check out Sysinternals (http://www.sysinternals.com/ [sysinternals.com]) and download the Misc utilities package, especially the PsTools suite. I use them all the time, like a telnet session (via RPC) into remote PCs, to clear up networking problems on them. netsh.exe then allows me to remove freakin' static WINS and DNS entries in TCP/IP properties, all without disturbing the user. It doesn't take long to learn and it saves gobs of time.

    Now I need to get back to my Linux lessons so I can use some discrete Linux servers on our edge networks, then they can start appearing closer and closer to The Core.

  • Re:At this rate... (Score:3, Interesting)

    by Furry Ice ( 136126 ) on Thursday May 03, 2007 @12:00AM (#18967703)
    Rather than wait years to learn how to do this kind of parsing, just start learning now.

    The solution to the problem of separating out "Jan" in the date from "Jan" in the filename is to realize that position matters.

    The data returned from "ls -l" is formatted in columns (that's why the filename comes last--it could have spaces in it!).

    I'm sure there are a lot of tools that can do this task, but awk is often a good replacement for grep when you need to look at a specific column.

    Here's some sample data:

    ls -l
    drwxrwxr-x 3 phiggins phiggins 4096 Jan 22 2005 www
    -rw-rw-r-- 1 phiggins phiggins 5746 Apr 10 13:32 xorg

    You can see the month is column 6, so let's try out some awk:

    ls -l | awk '$6 == "Jan"'
    drwxrwxr-x 3 phiggins phiggins 4096 Jan 22 2005 www

    awk is a bit weird and takes some learning, but the most important thing to remember is that many characters that awk uses will be interpreted by the shell, so always put your awk script inside single quotes. If you try to use double quotes, you're going to beat your head many times trying to figure out what's wrong with your program.
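    The trap can be seen directly (this assumes no positional parameters are set, so the shell expands $2 to nothing):

```shell
# With double quotes the shell expands $2 before awk ever runs; with no
# positional parameters set, awk receives '{print }' and echoes the line.
# With single quotes, awk itself interprets $2 as field 2.
set --                              # clear positional parameters
echo "Jan 22" | awk "{print $2}"    # prints: Jan 22
echo "Jan 22" | awk '{print $2}'    # prints: 22
```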

    You can match on a regex using:

    ls -l | awk '$6 ~ /Ja/'

    The part I've left out is the action part. An awk line consists of a pattern and an action. If the pattern is left out, it defaults to the pattern that matches every line. If the action is left out, it defaults to the action of printing out the whole line.

    Actions are enclosed in curly braces (one of the key reasons to enclose your script in single quotes!). Probably the most common thing to do is print out a subset of the fields of the line:

    ls -l | awk '$6 ~ /Ja/ {print $9}'
    www

    That will print just the filename. I've been slowly picking up more advanced pieces of awk over time, but even if you only know these basics, it can really help parse a lot of things that grep is unable to deal with!
  • PowerShell (Score:3, Interesting)

    by IpalindromeI ( 515070 ) * on Thursday May 03, 2007 @09:46AM (#18971603) Journal
    I've been using PowerShell for a couple months, and it's definitely better than cmd.exe. However, it's really built more like an interactive scripting environment, with shell features left as an afterthought.

    Only the most basic redirection is implemented. Basically you can use "> file", "2> file", and "2>&1". That's it. You can't create arbitrary fd's and dup them. It's like they didn't realize that the '1' representing stdout and '2' representing stderr actually mean something more general. Oh, and no input redirection. Trying to do:
    somecommand < file
    gives you this error:
    The redirection operator '<' is not supported yet.
    At line:1 char:14
    + somecommand < <<<< test.txt
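    For contrast, the descriptor features described above as missing (input redirection, opening and duplicating arbitrary fds) look like this in a POSIX shell; the file paths are made up for the demo:

```shell
# Input redirection, which PowerShell 1.0 rejects as "not supported yet":
tmp=$(mktemp -d)
printf 'hello\n' > "$tmp/in.txt"
tr 'a-z' 'A-Z' < "$tmp/in.txt"     # prints: HELLO

# Opening and duplicating an arbitrary descriptor:
exec 3>"$tmp/log.txt"   # open fd 3 for writing
echo "via fd 3" >&3     # write through fd 3
exec 3>&-               # close fd 3
cat "$tmp/log.txt"      # prints: via fd 3
```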


    Although you can technically use "2> file" to redirect errors, it's actually a big pain. Say your program outputs to stderr for various warnings, and you want to capture those. Well because everything is an object, each error line is converted to an ErrorRecord object, which is then serialized. Unfortunately, the serialization of an ErrorRecord includes a bunch of other clutter. Here's an example of one error line:
    > perl -e 'warn qq[Something bad happened!\n]' 2> out.err
    > cat out.err
    perl.exe : Something bad happened!
    At line:1 char:5
    + perl <<<< -e 'warn qq[Something bad happened!\n]' 2> out.err

    Three lines for every error, with a bunch of clutter to make it hard to read. In order to get succinct log files, I had to write my own ErrorRecord converter to get back to the one line I want.

    Command argument parsing is broken. The parser will split arguments that look like "-x12.34". So if you try to pass a switch with a floating point number as part of it, the program actually receives two separate arguments: "-x12" and ".34". You have to quote the entire thing to get it passed as one argument.

    One major annoyance with cmd.exe that has not been completely fixed is the quoting inconsistencies. Nested quotes don't seem to work right. In most shells, single quotes prevent any interpretation of the string, which is mostly true in PowerShell. So when writing a quick Perl one-liner, I use single quotes to make sure any Perl variables aren't interpreted by the shell.
    perl -e '$k = "hello"; print $k'
    That seems to work, and prints 'hello'. However, try adding a newline:
    perl -e '$k = "hello\n"; print $k'
    Now Perl gives me a syntax error regarding the backslash, which leads me to believe the shell is interpreting the string before handing it to Perl. In this case, I can work around it because Perl has its own quoting operators:
    perl -e '$k = qq[hello\n]; print $k'
    But how would you pass a string like this to some other program where you needed the quotes? I couldn't figure it out.

    As a scripting environment, it's pretty nice. And like I said, it is better than cmd.exe. However, basic shell functionality is semi-broken in many ways, because of the focus on Being Innovative With Everything As An Object.

    PS. Don't forget that the escape character is ` (backquote), and not \ (backslash) like in every other shell/language.
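    For example, a sketch of the backquote escapes:

```powershell
# The escape character is ` (backquote), not \ (backslash):
"first`nsecond"      # `n is a newline, so this prints two lines
"a `"quoted`" word"  # `" escapes a double quote inside double quotes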
  • by dedazo ( 737510 ) on Thursday May 03, 2007 @03:15PM (#18977139) Journal
    No, you don't understand. It's not like that at all. The I/O streams are objects themselves, and what you move through them are also qualified objects. It's just difficult to explain. I didn't "get it" either until I had one of those epiphanies like when you realize how std::vector works or how to use multiple CSS selectors or something like that. It's really cool.

    I wouldn't want to write an application with it because of the overhead, but for scripting (especially complex, stateful scripting) it just rocks.
