Cross Platform Packaging: A Dream Or Something More? 87
stevenl writes "A new project on sourceforge has just been set up for a cross-platform packaging standard. Whilst there isn't much there at the moment, plans are to produce a standard that will allow people to use it even if they have no binary utilities or a compiler to compile one with, and it's expected to be platform independent whilst still being lightweight. What are people's opinions of the cross-platform aspect taking off, or will we see another situation like we have with DPKG - great packaging system, but not widely used due to the inferior (but still good) RPM and proprietary things like installshield?" Frankly, apt-get [?] does just about everything that I need - but I'm curious as to what people think about something like this actually working - is it a pipe dream? Or possible?
Installshield blows. (Score:1)
Makefiles are handled for you if you have a half decent IDE. Windows has many good IDEs.
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Re:Ahem. (Score:1)
Anyone else notice ... (Score:1)
There's not even anything there to download yet! News for nerds, I guess...
Dependency checking (Score:1)
Now, I said that this shouldn't be too difficult to do. But that's not the case, now is it? Ok, so we've got these libraries. Cool. We know where the binaries are, right? Yup, but what version are they? There are some version numbers built into the names of the libraries (you may have to look at some symbolic links, but nothing too difficult), but what about programs? How do we have any idea what version they are?
I'm sad to say that Microsoft solved this problem a long time ago by allowing the integration of version information into the string tables of its executables and libraries. This is the standard way of handling things. Now, sure, you could do the same thing in Unix, but nobody does. Or at least nobody with any influence. Now, sure, you can query a binary for its version information, but which flag is it again? --version, right? Or is it -version? Err.. -V, no, that's not right. -v. Crap, that's 'verbose'. Well, which is it?
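The flag confusion is real: without embedded version metadata, about the best an installer can do is probe. Here is a rough sketch of that kind of guessing; the flag list and the "looks like a version" heuristic are my own illustration, not any real packager's logic:

```python
import subprocess

# Illustrative sketch (not any real packager's code): probe a binary with
# the usual suspects among version flags and grab the first dotted number.
VERSION_FLAGS = ["--version", "-version", "-V", "-v"]

def looks_like_version(token):
    # Heuristic: a token containing both a digit and a dot, e.g. "2.4.1".
    return any(ch.isdigit() for ch in token) and "." in token

def probe_version(binary):
    for flag in VERSION_FLAGS:
        try:
            out = subprocess.run(
                [binary, flag], capture_output=True, text=True, timeout=5
            )
        except (OSError, subprocess.TimeoutExpired):
            continue
        # Some programs print version info to stderr, so check both streams.
        for token in (out.stdout + out.stderr).split():
            if looks_like_version(token):
                return token
    return None
```

Of course this is exactly the kind of fragile guesswork the post is complaining about; a program whose `-v` means "verbose" and never exits is only saved by the timeout.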
The sad truth is that this just isn't feasible right now. It's going to take a lot more than a project with good intentions to get people to start putting version information into their binaries, and even when they do, there's a whole lot of people out there that have old binaries and will see absolutely no need to purchase an update to their OS (yup, some people buy their Unix, like Irix, Solaris, etc.) just to get version information put into their binaries. It's going to take even more than a good idea to get companies like Sun and SGI to recompile all of their code and change all of their Makefiles just to take advantage of these new whiz-bang version features.
Now, don't get me wrong. I think that the ability to maintain "packages" without the need of a database would be wonderful. I've dealt with my fair share of RPM headaches caused by taking the road less traveled and compiling things from scratch. But I think that before we develop a new packaging system, there are other more important problems that need to be addressed first. Attack the problem at the root, don't go for the branches or you'll never win any ground.
Re:The problem is in the dependency database (Score:2)
It's not perfect (because of the fact that the things you install don't auto-handle dependencies) but it gives you a nice package style view.
For example, to install IBM's JDK on my box, I just stuck it in
And stow then symlinks the files etc straight into
It stops cruddification of
The best bet would be something like this but to generate packages - if I could pack something into
Re:Wouldn't it be cool? (Score:2)
Or not. I don't necessarily install only source packages, even on BSD or Linux...
...and, from what I see on the Ethereal mailing lists, I'm not the only one; there are plenty of people who install binaries of Ethereal, for example.
Re:Wouldn't it be cool? (Score:4)
I don't know whether you'd ever get all the *NIXes to adopt one package format (heck, not even all of the *NIXes that use the Linux kernel use the same package format, so far; is it even possible to generate an RPM that works, for a given instruction-set architecture, on all distributions that use RPM?).
It's probably even more unlikely that you'd get Windows, or MacOS Classic (MacOS X might be considered "one of the *NIXes", although it may be different enough from other *NIX-flavored OSes that it'd be even less likely that it'd adopt some standard package format).
It might be possible to have tools such as Easy Software Product's Package Manager [easysw.com] (as mentioned in another posting; ESP are the folks who do CUPS) work with various non-*NIX packaging tools, as well as handling the various *NIX package formats it now handles (debs, RPMs, SVR4 packages, IRIX packages of some sort, HP-UX packages of some sort, source tarballs).
Some tools for packaging on Windows include MindVision's Installer VISE [mindvision.com] (available for Windows and MacOS), for which "qualifying shareware and freeware developers" can get a free license [mindvision.com] (it's what the GTK+ and GIMP for Windows [user.sgic.fi] uses), and Nullsoft's "SuperPimp" Install System [nullsoft.com], which is also free. (I've not used either of them, so I can't say how good or bad they are.)
Well, there's a tool called nmake, which comes as part of a package called "Visual C++" [microsoft.com] from some company up in Redmond, Washington that has done some software for Windows [microsoft.com]; its makefiles aren't exactly like those for the various *NIXes (but those aren't all the same, either - you have System V make, Sun's make which is a superset of SV make, GNU make, Berkeley make, etc.).
It's not clear that it's a package manager's job to deal with the differences between the "make"s on various platforms.
Here is a URL (Score:2)
http://www.easysw.com/epm/index.html [easysw.com]
Want to make $$$$ really quick? It's easy:
1. Hold down the Shift key.
Why reinvent the wheel? It has already been done (Score:4)
Re:Wouldn't it be cool? (Score:2)
EPM, posted earlier, looks pretty good for the Unix world, since it already supports quite a few platforms such as Debian, Red Hat, Solaris, HP-UX and Irix. This is unlikely to ever cover Windows - Microsoft has its own recommended installer format (Windows Installer,
Re:The problem is in the dependency database (Score:2)
There are incomplete versions of this (Score:1)
NeXTStep (now MacOS X) has "obese" binaries that can support lots of different architectures. However, I don't know if the packages that NeXTStep uses support the execution of non-trivial code at install time. Also, I don't know if obese binaries were parseable by Mach-O based implementations and those where OpenStep was just a library (like the Win32 version).
Probably the most portable "execution" formats that can be understood across platforms are HTML, Javascript, and Java. You could distribute an installer that was just a web page, used Javascript to sniff the platform, then executed the installer for the right platform. (This would be considered a privileged operation and the user might be asked to confirm it, and the applet or script snippet that was implementing this would have to have its code signed.) Check out InstallAnywhere [zerog.com], which uses some of these techniques (at least the last time I checked, anyway).
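The sniff-and-dispatch idea can be sketched outside the browser too. Here is a hypothetical Python version of the same dispatch step; the installer filenames are invented for illustration, and a real web-based installer would do this in Javascript as described above:

```python
import sys

# Hypothetical sketch of "sniff the platform, then run the right installer".
# The installer names here are made up; only the dispatch logic matters.
INSTALLERS = {
    "linux": "setup-linux.sh",
    "darwin": "setup-macos.command",
    "win32": "setup-windows.exe",
}

def pick_installer(platform=None):
    platform = platform or sys.platform
    # Prefix match, since e.g. older Pythons report "linux2" rather than "linux".
    for prefix, installer in INSTALLERS.items():
        if platform.startswith(prefix):
            return installer
    raise RuntimeError("no installer for platform %r" % platform)
```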
Yea! (Score:3)
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Client machines
---------------
- AMD 1.2 GHz
- super-fast graphics
- 1GB RAM
- ATM interface
- 0 disk
Client is basically a hot-rod x-terminal with a big fast pipe. No hard disk, only 1 very large ram disk. Files would be accessed through NFS. All programs would be run off of the application server(s), or the web.
Programs could be cached on the ramdisk. The machine would never be powered off, so its cache would become rich with programs and data.
If eventually the OS crashes, you'd restart, and have a virgin machine.
Server machine
--------------
Sky's the limit
The high-speed networking of tomorrow will make a great many things possible.
domc
Re:so (Score:1)
domc
Re:Yes, because OS's are becoming irrelevant. (Score:1)
A small network could have one or two application servers.
A network of a million users could have a separate server(s) for each app.
domc
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Diskless workstations would make the most sense on a large corporate network where there are many desktops to maintain. In such an organization there is little need for local storage.
domc
It's not about web software at all... (Score:1)
Re:how good is? (Score:1)
Package wars non-productive (Score:1)
Second, these package wars are non-productive. Saying that RPM is inferior to DPKG does not help anyone, especially when it's not true. Neither RPM nor DPKG is significantly superior to the other. When you break them down they are pretty much the same, just with some minor differences in implementation. They are both very good tools.
---
MSI (Score:1)
If they could get something that would reliably install stuff under Win2K (InstallShield really doesn't cover it)
Microsoft is now releasing most of their software (Office 2k, the new Visual Studio.NET) in
[...] and do compiling for makefiles (I don't even know if there is something to do makefiles in Windows anyway), I'd definitely get this package manager.
Go and grab one of the hundreds of projects that compile under both Win32 and UNIX. You'll notice that they use separate makefiles for Win32 - autoconf is not required (assuming you were talking about autoconf in "compiling for makefiles").
Re:The problem is in the dependency database (Score:1)
In each app directory a file could be placed by the maintainer (let's call it appinfo). This could be a simple perl hash or an xml document describing the package and the dependencies, as well as the home page of the document. Here is a simplified example. If you download the source yourself and install it yourself, then encap can still make the links for you and the packager will know about it simply by reading the
name = myapp
version = 1.001
url = http://myapp.com/installer/
depends = { name = somelib,version=2.001, url = http://someurl }
depends = {name = anotherlib, version=3.02
Like I said, the transport mechanism could be httpsync [mibsoftware.com]
The encap (or like) program could simply invoke httpsync and download the package, check for dependencies and keep calling httpsync to install the dependencies if they don't exist.
The beauty of it all is that the work is already done. Httpsync, as well as CVSup, rsync, etc., already exists; all you need is a relatively simple perl script and you are done.
The tricky part is to get maintainers who are willing to compile the app and serve it using whatever.
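To show how little machinery the appinfo idea needs, here is a hypothetical parser and recursive resolver for records in roughly the flat "key = value" shape shown above. The fetch step is a stub standing in for httpsync, and the record format is only an approximation of the example:

```python
import re

# Sketch of the appinfo idea: parse flat "depends = { name = ..., ... }"
# records and walk dependencies recursively before installing the app itself.
DEP_RE = re.compile(r"name\s*=\s*(\w+)")

def parse_depends(appinfo_text):
    deps = []
    for line in appinfo_text.splitlines():
        line = line.strip()
        if line.startswith("depends"):
            m = DEP_RE.search(line)
            if m:
                deps.append(m.group(1))
    return deps

def install(name, repo, installed=None):
    """repo maps package name -> appinfo text; returns the install order.

    The append step is where a real tool would invoke httpsync to fetch
    and unpack the package.
    """
    installed = installed if installed is not None else []
    if name in installed:
        return installed
    for dep in parse_depends(repo[name]):
        install(dep, repo, installed)
    installed.append(name)
    return installed
```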
Re:The problem is in the dependency database (Score:2)
1) Every application must reside in its own directory in
2) The directory must be named nameofapp.ver
3) Inside the app directory there may exist
4)
5) all the files are then symlinked to
The beauty of this system is that it's very easy to implement because it's just a matter of specifying a --prefix.
If you need to roll back to a previous version you simply relink the old directory.
You don't need a database; just doing an ls on the
You don't need to compile; in fact, you can use httpsync or cvsup to fetch the files directly from a url.
Easy, simple, human-readable, human-fixable; what else can you ask for?
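To make the scheme concrete, here is a rough stow-like sketch of steps 1-5; the paths and layout are my assumptions, not from the post. Installing symlinks everything under the versioned app directory into a prefix, and rolling back is just removing the links:

```python
import os

# Sketch of the per-app-directory scheme: symlink the contents of
# /someprefix/nameofapp.ver into a shared prefix, stow-style.
def link_app(app_dir, prefix):
    created = []
    for root, _dirs, files in os.walk(app_dir):
        rel = os.path.relpath(root, app_dir)
        target_dir = os.path.join(prefix, rel) if rel != "." else prefix
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(target_dir, name)
            if not os.path.lexists(dst):
                os.symlink(src, dst)
                created.append(dst)
    return created

def unlink_app(app_dir, prefix):
    """Rollback: remove every symlink in prefix that points into app_dir."""
    for root, _dirs, files in os.walk(prefix):
        for name in files:
            path = os.path.join(root, name)
            if os.path.islink(path) and os.readlink(path).startswith(app_dir):
                os.remove(path)
```

Relinking an older version is then just `unlink_app` on the new directory followed by `link_app` on the old one.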
Re:The problem is in the dependency database (Score:1)
If I remember correctly, the AT&T 3b2 used to do this back in the System 5 R 2 days but its packages were based on cpio.
Anyone remember ANDF ? (Score:1)
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Not for me. No thank you. As cheap as hard drives are getting, you want me to use bandwidth rather than have a copy of vi on my system?
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Re:Yes, because OS's are becoming irrelevant. (Score:2)
Re:You 'R a dumbass. [nt] (Score:1)
As for release dates, you should have come to expect open-source-style lack-of-release-dates by now. What he "announced" was a *projection*, not a promise. That's why it's different from when Microsoft misses a release date.
I don't agree with everything Linus does either, but if you're going to complain, say something valid.
--------
Genius dies of the same blow that destroys liberty.
Re:Wouldn't it be cool? (Score:2)
Re:zip file? (Score:1)
The problem is that oftentimes these files are not ZIP files, or, worse, are a proprietary installer program. Then you can't really access it under Linux at all.
Since the most recent versions of Windows have built-in support for ZIP files, it shouldn't be necessary any longer to distribute self-extracting files.
Re:zip file? (Score:1)
irony (Score:1)
Consider the tar and cpio factions, and "pax", the faction to end all (archiver) factions.
Whilst (Score:1)
Release Engineering/Configuration Management (Score:1)
I'll check out the project, perhaps I'll offer assistance, but I won't bet my career on it.
Re:whats wrong (Score:1)
Installshield Java Edition (Score:1)
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Re:Can Hemo Handle Simple English? (Score:1)
need I point out:
stevenl writes "A new project...
The post was written by the person of the name signified in bold.
Why not... (Score:2)
Package two files as one. One file is the encoded source code (for closed source) or just a tgz with all the source (for open source); we'll call this Part A. Then add Part B, an operating system emulator that is made to do only a single thing: run a fake os, very small, and execute a compiler that is built into the emulator to compile the source and then spit out the final binary onto the system...
For instance. Very simple hypothetical example.
I write a program called "DecBin" that converts decimal numbers to binary. I take the source files and put them in a
Now whenever somebody wants to install my program, they execute the binary. The binary unpacks the tgz and then runs its emulator/compiler on the source code and then spits out the binary as a.out or something.
Just an idea...
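Setting the emulator aside, the two-part layout itself is simple to sketch. In this hypothetical Python version, Part A is an in-memory tgz of the sources and the compile step of Part B is a stub where the emulator/compiler would run:

```python
import io
import tarfile

# Toy sketch of the "Part A / Part B" idea: Part A is a source tarball,
# Part B unpacks it and hands the file list to a build step. The build
# step is a placeholder for the emulated compiler.
def make_part_a(sources):
    """Pack a dict of {path: source_text} into an in-memory tgz."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tgz:
        for path, text in sources.items():
            data = text.encode()
            info = tarfile.TarInfo(name=path)
            info.size = len(data)
            tgz.addfile(info, io.BytesIO(data))
    return buf.getvalue()

def run_part_b(payload, build=lambda names: "a.out"):
    """Unpack Part A and invoke the (stubbed) compiler on its contents."""
    with tarfile.open(fileobj=io.BytesIO(payload), mode="r:gz") as tgz:
        names = tgz.getnames()
    return build(names)
```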
Re:Wouldn't it be cool? (Score:2)
Open Packages (Score:1)
How does this relate to Open Packages [openpackages.org]?
I know that Open Packages is a Unified BSD Package Collection, but how do these compare?
Lowest Common Demoninator (Score:3)
Also, the executables that would get distributed will likely be tailored to its host platform. Many programs will utilize the differences between these platforms. Again, I bring up the example of the Windows Registry. Many applications in Windows depend upon the installer setting up Registry keys that are accessed by the executable. How do you rectify this in *nix? Or the MacOS? Or with the ever growing number of embedded apps?
I'm afraid that program placement, management, configuration, etc. hits so close to the core of what makes a platform that this project will be difficult to complete (and be useful).
Here's a metric we can use to see if it ever succeeds: Will developers throw away InstallShield and rpm to use this?
Re:Wouldn't it be cool? (Score:1)
Any operating system can have a utility for compiling projects with makefiles. A makefile is, after all, just a set of instructions to be passed along to the compiler about how to compile (what options to use, etc.) the project. Any decent C/C++ compiler will have a make utility (for Windows/DOS, DJGPP [delorie.com] comes to mind).
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Re:RPM compatibility (Score:1)
Only UNIX platforms? (Score:2)
Also, how would it handle dependencies? A widely available version of X11 for BeOS, for example, puts files in different places than X11 for Linux does (AFAIK).
This doesn't sound feasible for anything other than a strictly UNIX platform, and then what do we have? Yet Another Packaging System.
how good is? (Score:1)
Styrofoam Peanuts! (Score:4)
Re:Ahem. (Score:1)
Shouldn't worry about it, someone else moderated me down as flamebait.
--
zip file? (Score:1)
Sun's Java Web Start... (Score:2)
Re:I hate cross-platform packaging. (Score:1)
OpenPackages.org (Score:2)
--
Portage will be out next month (Score:1)
apt-get? (Score:2)
Anyway, if you can't assume access to a compiler, the only thing you could do would be to ship binaries for every platform. And as you can probably tell, that would suck. I would say just use java and
Of course, that would require that you have binaries for that platform, but it's better than nothing.
Amber Yuan 2k A.D
Re:The problem is in the dependency database (Score:1)
You can do away with much of the symlinking by updating the tools.
Eg, have a help command which simply looks inside the 'Help' subdirectory of the application (in fact, the filer just opens that directory).
Likewise, you can make your shell 'run' a directory by running the file 'AppRun' inside.
You can even include the source code and make AppRun a shell script that compiles a binary for your platform automatically if it doesn't yet exist.
Also, this means that the source, help and binaries never get out of sync!
--
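A minimal sketch of the "run a directory" idea; the AppRun name follows the post, but everything else here is an assumption:

```python
import os
import stat
import subprocess

# Sketch of an application-directory launcher: a directory is "runnable"
# if it contains an AppRun file, which we execute. A fancier version would
# first compile a binary for the current platform if one doesn't exist.
def run_app_dir(app_dir, args=()):
    app_run = os.path.join(app_dir, "AppRun")
    if not os.path.exists(app_run):
        raise FileNotFoundError("not an application directory: %s" % app_dir)
    # Make sure the entry point is executable, then run it.
    os.chmod(app_run, os.stat(app_run).st_mode | stat.S_IXUSR)
    return subprocess.run([app_run, *args]).returncode
```

The source-compiling variant the post describes would just be an AppRun shell script that checks for a binary for the current platform and invokes make if it's missing.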
Re:RPM compatibility (Score:1)
While it might be true that RPM is used a lot more than DEBs, that doesn't mean Debian users are just going to abandon their packaging system. In fact, Debian users are probably more religious about it. And they're probably right too; DEBs work far better and the infrastructure is already in place.
So, while it might sound like a great idea to extend RPM to do DEB-like stuff it is too late (assuming that it is technically possible).
A solution using Perl... (Score:1)
The biggest issue with RPM we find is that everyone who builds them seems to think that every program should be placed in some set of hardwired directories. To provide flexibility for the user to choose their own installation root (or perhaps even install the software multiple times on the same machine for different purposes), we found RPMs sorely lacking.
Installation can be done by command line, remotely, and by someone without root privs in a directory where the user has write access and provide apache, perl, php and mysql function for that user.
To gain widespread IT acceptance, open source products are going to need to get much more sophisticated about how and where they are installed.
Britt...
The problem is in the dependency database (Score:5)
The problem is, inevitably, the database will get out of sync the moment you have to compile something from source because no
Once the database is out of sync, then new problems come up, and those are easily fixed by forcing an install or installing from source, and then it just gets worse.
Without a database, it would mean the installer would have to have a way to detect whether the dependent thing is installed or not, and in the correct version. I won't say that would be easy, but it is what would be needed. Until then, based on my past experiences with Redhat's RPM, I won't at all be interested in a fancy packaging system.
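A sketch of what that database-free detection might look like: instead of asking a database what is installed, go look for the artifact itself. The library directory list and matching rule here are my own assumptions, and this deliberately ignores the version problem discussed elsewhere in the thread:

```python
import os
import shutil

# Sketch of database-free dependency checking: probe the filesystem for
# the binary or shared library directly instead of trusting a database.
LIB_DIRS = ["/usr/lib", "/usr/local/lib", "/lib"]

def have_binary(name):
    # A binary "exists" if it is somewhere on PATH.
    return shutil.which(name) is not None

def have_library(soname, lib_dirs=LIB_DIRS):
    for d in lib_dirs:
        if os.path.isdir(d):
            for entry in os.listdir(d):
                # Match libfoo.so as well as libfoo.so.2, libfoo.so.2.0.1, ...
                if entry == soname or entry.startswith(soname + "."):
                    return True
    return False
```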
Wouldn't it be cool? (Score:2)
If they could get something that would reliably install stuff under Win2K (InstallShield really doesn't cover it), and do compiling for makefiles (I don't even know if there is something to do makefiles in Windows anyway), I'd definitely get this package manager.
Great idea. (Score:2)
I like the idea of not having to know how to make an HP Swinstall package, a Sun pkgadd package, and others. Right now, I need to shoulder tap someone to have a package made for a packaging system I am unfamiliar with.
As for Windows flavors... I find myself wondering why even port anything to Windows... let alone package it. But of course, that is entirely biased.
Re:I hate cross-platform packaging. (Score:1)
It's not about packaging for as many OS's as possible in one package, but to create many packages for many OS's/distributions from a single cvs repository with a single tool.
Re:Portage will be out next month (Score:1)
I have yet to see a packaging system... (Score:1)
... that doesn't punt on managing multiple versions of the same "thing" and multiple parallel instances of the same thing running at the same time, with all the runtime configuration problems that go with it.
For example: it is impossible to create a package of, say, a sophisticated perl module without having to effectively include a whole perl distribution with it. Why? Because it is impossible to have two different versions of the same module available at the same time, at least not without serious trickery.
These are the really hard problems, and as long as they are being ignored, this new packager is just so much wasted effort.
Not that hard really, but all must participate! (Score:3)
Re:A solution using Perl... (Score:1)
--
Re:zip file? (Score:1)
I hate cross-platform packaging. (Score:4)
This is one of the reasons why I hate VIA: because they do everything so bass-ackwards.
horrible idea (Score:2)
Re:Yes, because OS's are becoming irrelevant. (Score:1)
Just a thought...
-Bucky
The few, the proud, the conservative.
Re:Yes, because OS's are becoming irrelevant. (Score:3)
There are more programs that couldn't/shouldn't be run from the web than ones that should. Can you imagine trying to run Quake over the web? What about other processor intensive applications? (seti@home, apache, etc...) I don't see a day when people will give up performance/security just so that there can be a unified OS... I just can't see it.
-Bucky
The few, the proud, the conservative.
Re:Portage will be out next month (Score:1)
The BSD people stick with the "tried and true" libraries that work, and work they do, but they don't get all the new features that revamping those libraries can bring. Maybe you don't want those features; good, stick with BSD.
For those of us that like new features, we will stick with linux and its ever expanding set of features. The programs that tend to "not work across distributions" are those compiled with the higher level API stuff anyway. Core APIs all stay the same and are very compatible across distributions.
Maybe we should call distributions what they truly are, OS's with different versions of the exact same libraries, except that would be a crappy name, so how about we call them distributions . . .
Re:Yes, because OS's are becoming irrelevant. (Score:1)
KTB:Lover, Poet, Artiste, Aesthete, Programmer.
Yes, because OS's are becoming irrelevant. (Score:5)
The thing that people forget is that this is a good thing. What Linux needs is to develop a cross platform packaging system such as this so that the web can utilise it, and so that the Linux system is at the centre just when these new developments are taking off. The future is OS independent. If Linux is to survive in such a world, it needs to be independent too.
KTB:Lover, Poet, Artiste, Aesthete, Programmer.
But this begs the question (Score:1)
OO (Score:2)
You know, if you are going to imposter-ize me, you should wear ankle guards, because when I find you I am going to unleash an army of rabid she-gnomes on you.
-perdida
RPM compatibility (Score:1)
Interesting..... (Score:1)
Even nicer... (Score:1)
Re:whats wrong (Score:1)
Re:Lowest Common Demoninator (Score:1)
Re:Wouldn't it be cool? (Score:1)
Re:Wouldn't it be cool? (Score:1)
Also, I agree with you about Linux. I would actually prefer dl'ing precompiled applications. But if the source is all that is included it should still be easy for me to install.
Greg
Re:The problem is in the dependency database (Score:1)
I have used an NFS capable variation of /s from the UW Madison CS department. [wisc.edu]
At one of my sites I set up a set of tools, not unlike stow [gnu.org] and graft [gormand.com.au], that would build sets of software for anyone to use. The set of tools would automatically reconfigure a user's environment like encap (can't remember where that is from). It would however do it in the filesystem so that you could appropriately control the revisions or toolset that a script was coded to use. A.K.A. #!/home/gulfie/u/project_uts/bin/perl -w
It is not a packaging system as such; it is more of a software installation system. But a packaging system on top of this would be almost trivial... I like trivial; it is more likely to be gotten correct.