Jon Johansen has used Mono and MonoDevelop to build DeDRMS, a program used to decrypt the music you purchase from the iTunes Music Store.
I tried it out, and I was able to play the music I had purchased on Linux; it is really nice.
Johansen was on the #mono channel, and we got a chance to look into some of the performance issues in Rijndael. Ben noticed something interesting: plenty of our array accesses were done with a byte value on an array that always had at least 256 elements, so the bounds check can never fail. He cooked up a patch that does said array-bounds-check (ABC) elimination.
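To illustrate the pattern (this is a made-up sketch, not the actual Mono Rijndael code): any byte value is between 0 and 255, so a load from a table with 256 entries can never go out of bounds, and the JIT is free to drop the check.

    using System;

    class SBoxDemo
    {
        // A 256-entry table: every possible byte value is a valid index,
        // so the bounds check on Table[b] is provably redundant and a
        // candidate for ABC elimination.
        static readonly uint[] Table = new uint[256];

        static uint Transform(byte[] input)
        {
            uint acc = 0;
            foreach (byte b in input)
                acc ^= Table[b];
            return acc;
        }

        static void Main()
        {
            Console.WriteLine(Transform(new byte[] { 1, 2, 3 }));
        }
    }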
A more comprehensive patch has been cooked up by Massimiliano (who joined Novell today to work on Mono) which hopefully will make it in before Mono Beta 1.
A series of dates for the Mono 1.0 release has been posted.
Posted on 26 Apr 2004
Jeff seems to like Cringely's statement: "The central point was that paying too much attention to Microsoft simply allows Microsoft to define the game. And when Microsoft gets to define the game, they ALWAYS win."
A nice statement, but nothing more than a nice statement; beyond that, it is all incorrect.
Microsoft has won in the past due to many factors, none of them related to `letting them define the game'. Here are a couple from a long list:
In 1993-1994, Linux had the promise of becoming the best desktop system. We had real multi-tasking and a real 32-bit OS, client and server in the same system: Linux could be used as a server (file sharing, web serving), and we could run DOS applications with dosemu. We had X11: we could run applications remotely on a large server and display them on a small machine. Linux quickly became a vibrant, innovative community, and with virtual desktops in our window managers we could do things ten times as fast as Windows users! TeX was of course `much better than Windows, since it focuses on the content and the logical layout', and for those who did not like that, there was always the "Andrew" word processor. Tcl/Tk was as good as building apps with QuickBasic.
And then Microsoft released Windows 95.
The consensus at the time? Whatever Microsoft was doing was just a thin layer on top of COM/DCOM/Windows DNA, which to most of us meant `same old, same old, we are innovating!'.
And then Microsoft comes up with .NET.
Does something like XAML matter? Not really. But it makes it simple for relatively newbie users to create relatively cute apps, in the same way anyone could build web pages with HTML.
Does Avalon really matter? It's a cute toolkit, with tons of widgetry, but nothing that we can't do on a weekend, right?
Does the fact that it is built on top of .NET matter? Well, you could argue that it has some productivity advantages and security features, and get into a long discussion of .NET vs. Java, but that is beside the point.
Everyone is arguing about tiny bits of the equation: `We have done that with Glade before!', `Gtk/Qt are cross-platform!', `We can get the same with good language bindings!', `We already have the widgets!', `Cairo is all we need', `What do users really want?' and of course `Don't let them define the game!'.
They are all fine points of view, but what makes Longhorn dangerous for the viability of Linux on the desktop is that the combination of Microsoft deployment power, XAML, Avalon and .NET is killer. It is what Java wanted to do with the Web, but with the channel to deploy it and the lessons learned from Java mistakes.
The combination means that Longhorn apps get the web-like deployment benefits: develop centrally, deploy centrally, and safely access any content with your browser.
The sandboxed execution in .NET [1] means that you can visit any web site and run rich local applications (as opposed to web applications) without fearing for the security of your data: spyware, trojans and what have you.
Avalon also means that these new "Web" applications can visually integrate with your OS: they can use native dialogs and the functionality in the OS (like the local contact picker).
And building fat clients is arguably easier than building good-looking, integrated, secure web applications (notice: applications, not static web pages).
And finally, Longhorn will get deployed, XAML/Avalon applications will be written, and people will consume them. The worst bit: people will expect their desktop to be able to access these "rich" sites. With 90% market share, it seems doable.
Will Avalon only run on Longhorn? Maybe. But do not count on that. Microsoft built IE4 for Windows 98, later backported it to Windows 95 and Windows 3.11, and moved it to HP-UX and Solaris.
The reason people are genuinely concerned and are discussing these issues is that they do not want to be caught asleep at the wheel again.
Will this be the end of the world for Linux and the Mac? Not likely; many of us will continue using our regular applications and enjoying our nicely usable and consistent desktops, but it will leave us out of some markets (just as it does today).
Btw, the Mozilla folks have realized this already.
[1] Although it was easy to see why .NET included Code Access Security (CAS) in .NET 1.0, there was no real use for it at the time. With Longhorn/Avalon/XAML it becomes obvious why it was implemented.
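For readers who have not looked at CAS, a minimal sketch of the kind of check it performs (the file path is just a placeholder): fully trusted code passes the demand, while code downloaded from an untrusted web site would not.

    using System;
    using System.Security;
    using System.Security.Permissions;

    class CasDemo
    {
        static void Main()
        {
            // Ask the runtime to walk the call stack and verify that every
            // caller has been granted read access to this (placeholder) path.
            FileIOPermission perm = new FileIOPermission(
                FileIOPermissionAccess.Read, @"C:\data\example.txt");
            try
            {
                perm.Demand();
                Console.WriteLine("Caller may read the file.");
            }
            catch (SecurityException)
            {
                Console.WriteLine("Demand failed: the caller is not trusted enough.");
            }
        }
    }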
Although some of the discussion has centered around using a native toolkit like Gtk+/XUL to build a competitor that would have ISV sex appeal, this is not a good foundation, as it won't give us Web-like deployment (we need a stack that can be secured to run untrusted applications, and we need to be able to verify the code that is downloaded, which leaves us with Java or .NET).
Time is short: Microsoft will ship Avalon in 2-3 years, and they already have a preview of the technology out.
I see two possible options:
I think someone will eventually implement Avalon (with or without the assistance of the Mono team); it's just something that developers enjoy doing.
If we choose to go in our own direction, there are certain strengths in open source that we should employ to get to market quickly: requirements, design guidelines, key people who could contribute, compatibility requirements and deployment platforms.
We have been referring internally at Novell to the latter approach as the Salvador platform (after a long debate about whether it should be called MiggyLon or Natalon).
We do not know if and when we would staff such an effort, but it's on the radar.
Are there patents in Avalon? It is likely that Microsoft will try to get some patents on it, but so far there are few or no new inventions in Avalon.
Posted on 24 Apr 2004
I will be doing the keynote at the Usenix Virtual Machine Research and Technology Symposium on May 6 and 7 in California next month.
Am also attending the Usenix Conference in Boston this summer.
Nice new search engine from Amazon.
Posted on 19 Apr 2004
Greg was nice enough to point me to the PathScale compiler suite, a high-performance compiler suite for AMD64. PathScale's compiler is based on Open64, but has reportedly updated its C/C++ compiler front ends to a recent GCC (as opposed to the current Open64-derived compilers).
PathScale is led by Fred Chow, an ex-SGI developer who created WHIRL and later Pro64.
So as a starting point for a Managed C++ compiler, PathScale sources might be a better option.
This product shows one of the splits people were afraid would be used to circumvent the GPL: the front ends, which are fairly complex (C++), have been split out from the back end, and a proprietary, highly optimizing compiler has been developed for the AMD64. It was just a matter of time before someone did this, though.
Posted on 18 Apr 2004
A fresh patch and a full tarball for MCS.
I read Anthony Green's feedback on using Open64 as the foundation for implementing a Managed C++ compiler. I respectfully disagree with his opinion.
Let me explain why Open64 is the best starting point for implementing Managed C++.
First, the requirements:
Since C++ is a language of titanic dimensions, it is not one that you want to reimplement. Your best choice is to pick an existing C++ compiler. In the case of the free software community, that means gcc or any of its forks.
The question is whether GCC's internal IR can be retargeted to produce code for the stack-based CIL and whether you can propagate the extra available metadata. The latter seems like a problem that we would have with both Open64 and GCC.
Now, what makes Open64 interesting is the fact that we can achieve the first goal without touching GCC: C and C++ compilation would be performed with the Open64 compiler down to WHIRL, and then a new generic WHIRL-to-CIL compiler would generate the files we want. We do not have to touch any of the existing GCC internals (it is a difficult code base to work with).
The above is very similar to IKVM's JVM to CIL compiler: the input "language" is an IR, the output language is a different IR.
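To make the IR-to-IR idea concrete, here is a hypothetical sketch: the Whirl* node types below are invented for illustration and are not the real WHIRL data structures; only the Reflection.Emit calls are real API.

    using System;
    using System.Reflection.Emit;

    // Invented, WHIRL-flavored expression nodes; real WHIRL is far richer.
    abstract class WhirlExpr { }
    class WhirlConst : WhirlExpr { public int Value; public WhirlConst(int v) { Value = v; } }
    class WhirlAdd : WhirlExpr { public WhirlExpr Left, Right; public WhirlAdd(WhirlExpr l, WhirlExpr r) { Left = l; Right = r; } }

    class WhirlToCil
    {
        // Recursively lower a tree-shaped IR into stack-based CIL.
        static void Emit(ILGenerator il, WhirlExpr e)
        {
            WhirlConst c = e as WhirlConst;
            if (c != null) { il.Emit(OpCodes.Ldc_I4, c.Value); return; }

            WhirlAdd a = (WhirlAdd) e;
            Emit(il, a.Left);
            Emit(il, a.Right);
            il.Emit(OpCodes.Add);
        }

        static void Main()
        {
            DynamicMethod dm = new DynamicMethod("Eval", typeof(int),
                Type.EmptyTypes, typeof(WhirlToCil).Module);
            ILGenerator il = dm.GetILGenerator();
            Emit(il, new WhirlAdd(new WhirlConst(2), new WhirlConst(3)));
            il.Emit(OpCodes.Ret);
            Console.WriteLine(dm.Invoke(null, null));   // prints 5
        }
    }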
The fact that Open64 does not target the x86 is, in a way, irrelevant, because we are not interested in targeting the x86. We are interested in targeting the CIL.
If we were to use the current GCC, we would have to intercept the compilation at a suitable stage, most likely deal with RTL, and produce bytecodes for the CIL. RTL is hard to penetrate and is deeply linked to GCC internals. WHIRL is independent, well documented, and has various tools to process, consume and analyze the various WHIRL stages.
Finally, there is the point that the FSF and the GCC maintainers refuse to make structural changes to GCC on philosophical grounds: a split of this kind would encourage the creation of proprietary front-end and back-end systems.
This means not only that it is better to work with the Open64 fork of GCC, which has already made this pragmatic decision and is a better foundation for targeting the CIL, but also that our goals are more closely aligned with Open64's than with those of the GCC maintainers.
The latest version of Open64 folded in the changes from Intel and ICT.
Posted on 15 Apr 2004
Some people think that I wrote the Mono JIT. I did not. Dietmar, Paolo, Zoltan and Martin are the people who have hacked on the JIT engine in Mono.
Recently, as a side project, I have been playing with the JIT a bit to improve some code generation here and there, and together with Ben I have been learning how it works internally; it is fascinating to watch it in action.
We have been using a simple example, String.GetHashCode, hoping to get things to the point where the JIT output matches GCC -O3. Our code is slightly smaller than GCC's, but it wastes two registers, so in the tight loop it ends up placing a local variable on the stack, which has a high impact on performance.
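For context, the loop being compiled is roughly of this shape (a sketch of a typical string hash, not the exact corlib code): the whole body should stay in registers, which is where the two wasted registers hurt.

    using System;

    class HashDemo
    {
        // A typical string-hash loop, similar in shape to String.GetHashCode;
        // ideally h, i and the string data all stay in registers.
        static int Hash(string s)
        {
            int h = 0;
            for (int i = 0; i < s.Length; i++)
                h = (h << 5) - h + s[i];   // h * 31 + s[i]
            return h;
        }

        static void Main()
        {
            Console.WriteLine(Hash("mono"));
        }
    }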
Someone on IRC pointed me a few days ago to Open64, a compiler suite that includes C, C++ and Fortran compilers that generate a new intermediate representation called `WHIRL'.
Some of the compiler front-ends are the GNU compiler front ends that have been re-targeted to generate WHIRL instead of RTL. This is fascinating because it means we can now seriously consider implementing the ECMA managed extensions to C++ (the job of moving to a non-RTL IR was done by SGI).
It also means that the output of any compiler that is moved to generate WHIRL could run on any ECMA VM.
You can get the WHIRL IR documentation from here.
This topic keeps coming up: are there tools to manipulate CIL images (instrument code with pre- and post-processing, add classes, remove classes, trim methods, replace methods, append instructions)? Some time ago the fine folks at the University of Coimbra implemented such a tool: the Runtime Assembly Instrumentation Library.
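I have not studied RAIL's API in detail, but the inspection half of the job looks roughly like this with plain Reflection; the rewriting half (adding, removing and replacing code before saving a new image) is exactly what a dedicated library adds.

    using System;
    using System.Reflection;

    class Inspect
    {
        static void Main(string[] args)
        {
            // Walk every type and method of the given assembly; rewriting
            // the CIL afterwards is beyond what plain Reflection can do.
            Assembly asm = Assembly.LoadFrom(args[0]);
            foreach (Type t in asm.GetTypes())
            {
                Console.WriteLine(t.FullName);
                foreach (MethodInfo m in t.GetMethods(
                    BindingFlags.Instance | BindingFlags.Static |
                    BindingFlags.Public | BindingFlags.NonPublic |
                    BindingFlags.DeclaredOnly))
                    Console.WriteLine("  " + m.Name);
            }
        }
    }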
The other day Nat and I met David Vaskevitch, Microsoft's CTO. We went dancing.
Posted on 12 Apr 2004
Blogs are the TV of the Internet. Reading them is fun, but they are not particularly challenging.
Plenty of channels to choose from though.
Posted on 10 Apr 2004
My e-mail is now 40% spam after it has gone through SpamAssassin; in the mornings or on weekends it is 50%.
These days I use Mutt to manually remove the spam before I incorporate my e-mail into Evolution, where the mail gets split into plenty of folders (which would make spam removal harder).
Is everyone else getting this same volume of trash?
Of course, the big news is that Microsoft has released an internal project called "WiX" as open source, and it is being hosted on SourceForge. A good move, since it gives a lot of people confidence that the license used is an open source license, without having to spend quality time reading every sentence of a new license.
From the blog entry of one of the authors you can see that Stephen Walli was behind making this happen. I first saw Stephen giving a talk on startups and VCs at Usenix, in the early days of Ximian, and I remember taking notes frantically. Later I had a chance to meet him at OSCON in Portland.
Posted on 05 Apr 2004
I am making slow progress on anonymous methods in the C# compiler -due to some very embarrassing assumptions on my part- but most issues have been solved now. I am now implementing parameter capturing and must finish field access references from anonymous methods, fix a couple of regressions with iterators, and commit quickly to CVS.

Martin Baulig has been going through the C# generics spec item by item, double-checking that his compiler has everything that we are supposed to have (generics tests today amount to 20% of our positive tests and 10% of our negative tests). He has finished with the generics spec as far as the ECMA working draft is concerned, but it is lacking some of the latest additions that were freshly submitted in January.
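To show what the parameter-capturing work is about, a small example (not from our test suite, just an illustration): the parameter `factor' is used inside the anonymous method, so the compiler has to hoist it into a generated class that both the enclosing method and the delegate share.

    using System;

    delegate int Scaler(int x);

    class AnonDemo
    {
        // `factor' is captured by the anonymous method below, which is what
        // forces the compiler to hoist it out of the method's stack frame.
        static Scaler MakeScaler(int factor)
        {
            return delegate(int x) { return x * factor; };
        }

        static void Main()
        {
            Scaler timesThree = MakeScaler(3);
            Console.WriteLine(timesThree(7));   // prints 21
        }
    }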
Martin also worked on the new alias features in C# 2.0 and some of that code is already on the CVS tree.
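For those who have not seen them, the alias features boil down to the `::' qualifier and the `global' alias (plus `extern alias' for conflicting assemblies); a small example of the first two:

    using Col = System.Collections;

    class AliasDemo
    {
        static void Main()
        {
            // `Col::' is resolved strictly as an alias, never as a type name,
            // and `global::' always starts lookup at the root namespace.
            Col::ArrayList list = new Col::ArrayList();
            list.Add(42);
            global::System.Console.WriteLine(list.Count);   // prints 1
        }
    }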
Reflection.Emit is sadly not complete enough to build our generics C# compiler (even with the recently released .NET Framework 2.0 preview), so we had to resort to additional hidden methods in our Reflection.Emit API which are only available to our compiler. We hope to post the list of our API needs in case Microsoft is interested in exposing these kinds of changes in their implementation.
At the PDC, in one of the compiler BOFs, Brad Abrams asked whether people had written a compiler that followed the Common Language Specification (CLS) rules, and very few people said they had. Although we had a C# compiler, we had not implemented the CLS rules for it.
Marek Safar recently contributed the CLS support for our C# compiler. The initial patch is about two months old; we went through various code reviews before the code could land in CVS: the traditional review for maintainability, style and performance was done before we could take his patch. Marek did a fantastic job addressing every concern that Martin and I had. It turned out that the CLS tests have a minimal impact on compilation time.
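To give an idea of what the new checks catch, a small made-up example: once an assembly declares itself CLS-compliant, public members using non-CLS types are flagged by the compiler.

    using System;

    [assembly: CLSCompliant(true)]

    public class Widths
    {
        // uint is not a CLS-compliant type for a public member, so the
        // compiler now warns here.
        public uint Width;

        // Exposing the value through a CLS type keeps consumers written in
        // other languages happy.
        public long WidthAsLong { get { return Width; } }
    }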
Partial classes are another feature that more and more people are asking about. On the surface you would think it's trivial to implement, and in fact most of it is, except that there is a tiny little clause in the spec that states that type lookup must be performed using the current list of using declarations. The problem is that the list of using declarations might be different from file to file, so in two different pieces of a partial class the name "MyType" might have a different meaning.
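A made-up example of the clause in question: the same unqualified name means two different things in the two halves of the class, depending on which file's using declarations are in scope.

    // File1.cs
    using Timer = System.Timers.Timer;

    public partial class MyType
    {
        // Here `Timer' is System.Timers.Timer.
        Timer slowTimer = new Timer(1000);
    }

    // File2.cs
    using Timer = System.Threading.Timer;

    public partial class MyType
    {
        // Here the very same name is System.Threading.Timer, so the compiler
        // has to remember which using declarations belong to which file.
        void Stop(Timer t) { t.Dispose(); }
    }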
Partial classes are debatably an ugly hack, but at least this rule makes sense in terms of doing what the programmer expects.
Implementation-wise, it means that we need to do a lot of code shuffling, and it is not something that we are likely to have by Mono 1.0.
Posted on 04 Apr 2004
Erik pointed me to this nice writeup on the usability of Gnome given recent debates on the topic.
Posted on 03 Apr 2004
I am counting the minutes for Sun to ship our Mono implementation for Solaris. Maybe we can still make it to the Solaris 10 release.
Just picture the benefits: an out-of-the-box free C# compiler on Solaris SPARC and Solaris Intel; out-of-the-box ASP.NET and ADO.NET on SPARC; and the Gtk# bindings for writing applications for the Java Desktop System.
Not to mention that they get the industry's most sexy JIT compiler for free.
I am walking around with an extra cell phone battery in case McNealy or Schwartz decide to call me up over the weekend to discuss potential agreements (if I don't pick up, please leave a message; the wonders of AT&T Wireless).
Posted on 02 Apr 2004