Buck Stopping and the White House

by Miguel de Icaza

In his Open Letter to the President, Ralph Nader points out that although Bush admitted "mistakes were made", he quickly moved on and changed the subject:

You say "where mistakes have been made, the responsibility rests with me." You then quickly change the subject. Whoa now, what does it mean when you say the responsibility for mistakes rests with you?

Responsibility for "mistakes" that led to the invasion, which other prominent officials and former officials say were based on inaccurate information, deceptions, and cover-ups?

Responsibility for the condoning of torture, even after the notorious events at Abu Ghraib prison were disclosed?

Responsibility for months and months of inability to equip our soldiers with body armor and vehicle armor that resulted in over 1,000 lost American lives and many disabilities?

Responsibility for the gross mismanagement of outsourcing both service and military tasks to corporations such as Halliburton that have wasted tens of billions of dollars, including billions that simply disappeared without account?

Responsibility for serious undercounting of official U.S. injuries in Iraq, because the injuries were not incurred in direct combat, so as to keep down political opposition to the war in this country?

Click to read the rest of the letter.

Posted on 21 Jan 2007

Crap in Vista, part 2

by Miguel de Icaza

Microsoft has posted some answers to questions regarding the Content, Restriction, Annulment and Protection (or CRAP, some people euphemistically call it "Digital Rights Management") features built into Vista. I previously blogged about Peter Gutmann's explanation of the problems that it would cause in Vista.

Microsoft has responded to these claims here.

The response does little to contradict Peter's findings, and merely tries to shed some positive light on things. They answer questions in the best style of Ari Fleischer; for example, when it comes to cost, they conveniently ignore the question and instead focus on the manufacturing benefits (!!!).

It is interesting to compare this with the presentation by ATI at Microsoft's 2005 hardware conference (WinHEC), Digital Media Content Protection, which takes a more honest look at the costs throughout the stack.

The Slashdot discussion, browsed at +4, has the good stuff.

Posted on 21 Jan 2007

Sansa Connect Video

by Miguel de Icaza

You can see the UI of the Sansa Connect in action in this Engadget video:

It apparently will also let you browse your Flickr photos (I wonder if they use the Flickr.Net API) and share songs with your friends, and it seems to have some sort of Last.fm-like listening mode.

The details on the song sharing are not very clear; it all seems bound to the service. But I wonder if you can pass around your own MP3s.

Anyways, am buying this on the grounds that it runs Mono, and so I can finally show a physical object to the family that runs it.

Posted on 18 Jan 2007

Mono-based device wins Best-of-Show at CES

by Miguel de Icaza

The SanDisk Sansa Connect MP3 player won the Best of Show award in the MP3 and Audio category.

The Sansa Connect is running Linux as its operating system, and the whole application stack is built on Mono, running on an ARM processor.

There is a complete review of it at Engadget. Among the features it includes:

  • 4GB of memory.
  • SD slot for extra storage.
  • WMA, MP3, subscription WMA, and PlaysForSure support.
  • Internet Radio streaming.
  • WiFi.
  • Photo browser.
  • 2.2-inch TFT color screen.

The WiFi support allows users to download music from providers or their own home servers.

The Sansa Connect is designed by the great guys at Zing, and it will be available for purchase in a couple of months.

Posted on 17 Jan 2007

Not a Gamer

by Miguel de Icaza

With all the rage and discussion over the PS3 vs the Wii and how the Wii is a breakthrough in interactivity, I decided to get myself one of those, and auctioned my way to one on eBay.

The last time I owned a console it was an Atari 2600, and I barely got to play with it.

Wii Sports is a very nice program, and I enjoy playing it. As reported, your arms are sore the next day after playing Wii Tennis.

I went to the local game store and bought some assorted games for the Wii, the clerk said "excellent choices sir", as if I was picking a great cheese at Formaggio Kitchen.

The graphics are amazing, but I could not stand playing any of the games. The Zelda graphics are incredibly well done.

But all I could think of was the poor guy in QA that must have been filing bugs against this. Man, do I feel sorry for the QA guys that do games.

But I just do not feel like solving Zelda. Am sure it must have some kind of appeal to some people, but solving puzzles in a 3D world and shooting at stuff and earning points did not sound like an interesting enough challenge. I can appreciate the cuteness of having to find the cradle in exchange for the fishing pole, and of shooting rocks and "Z-targeting" something.

As far as challenges go, to me, fixing bugs, or writing code is a more interesting puzzle to solve. And for pure entertainment, my blogroll and reddit provide a continuous stream of interesting topics.

Am keeping the Wii though. Playing Wii Boxing and Wii Tennis with Laura is incredibly fun (well, watching Laura play Wii Boxing accounts for 80% of the fun).

When I got the Wii, I told myself "If I like this, am getting the PS3 and the XBox". Well, I actually just thought about it, I do not really talk to myself.

But am obviously not a gamer.

Posted on 11 Jan 2007

Programmer Guilt

by Miguel de Icaza

With Mono, it has often happened that I have wanted to work on a fun new feature (say, C# 3.0) or implement a fun new class.

But most of the time, just when am about to do something fun for a change, I think "don't we have a lot of bugs out there to fix?", so I take a look at Bugzilla, try to fix some of them, follow up on bugs, and try to produce test cases.

By the time am done, I have no energy left for the fun hack.

I need some "guilt-free" time, where I can indulge in doing some work that is completely pointless. But there is a fine balance between happy-fun-fun hacking and making sure that Mono gets a great reputation.

Posted on 11 Jan 2007

Mono and C# 3.0

by Miguel de Icaza

Since a few folks have asked on IRC and they are not using my innovative comment-reading service, am reposting (with some slight edits and clarifications) the answer to "Will Mono implement C# 3.0?"

Yes, we will be implementing C# 3.0.

We have been waiting for two reasons: first, we wanted to focus on fixing bugs in the existing compilers to ensure that we have a solid foundation to build upon. I very much like the item on the Joel Test that asks "Do you fix bugs before writing new code?"

C# 3.0 is made up of about a dozen small new features. The features are easy to implement on their own, but they rely heavily on the 2.0 foundation: iterators, anonymous methods, variable lifting and generics.
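To make the dependency concrete, here is a minimal sketch (the method names are illustrative, not from Mono) showing the same captured-variable closure written first with C# 2.0 anonymous-method syntax and then as a C# 3.0 lambda; the compiler builds the lambda on the same closure-class and variable-lifting machinery:

```csharp
using System;

class LambdaSketch
{
    // C# 2.0: anonymous-method syntax; "offset" is captured (lifted)
    // into a compiler-generated closure class
    public static Func<int, int> MakeAdderOld(int offset)
    {
        return delegate(int x) { return x + offset; };
    }

    // C# 3.0: the same closure written as a lambda; the compiler
    // reuses the anonymous-method machinery underneath
    public static Func<int, int> MakeAdderNew(int offset)
    {
        return x => x + offset;
    }

    static void Main()
    {
        Console.WriteLine(MakeAdderOld(10)(5)); // prints 15
        Console.WriteLine(MakeAdderNew(10)(5)); // prints 15
    }
}
```

Both forms compile to the same shape, which is why bugs in the anonymous-method implementation surface immediately once lambdas land on top of it.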

Second, since we have been fixing and improving the 2.0 compiler anyway, we get to implement the next draft of the specification instead of the current draft. This means that there is less code to rewrite when and if things change.

Fixing bugs first turned out to be really important. In C# 3.0, lambda functions are built on the foundation laid out by anonymous methods, and it turned out that our anonymous method implementation even a few months ago had some very bad bugs in it. It took Martin Baulig a few months to completely fix it. I wrote about Martin's work here.

The second piece is LINQ: some bits and pieces of the libraries have been implemented, and those live in our Olive subproject. Alejandro Serrano and Marek Safar have contributed the core of the Query space, and Atsushi did some of the work on the XML Query libraries. We certainly could use help and contributions in that area.

Anecdote: we started on the C# 2.0 compiler about six months before C# 2.0 was actually announced at the 2003 PDC. Through ECMA we had early access to the virtual machine changes to support generics and to the language changes in C#. By the time of the PDC we had an almost functional generics compiler.

The spec was not set in stone, and it would change in subtle places over the next two years. So during those two years we worked on and off on completing the support and implementing the changes to the language as they happened.

Most of the hard test cases came when C# 2.0 was released to the public as part of .NET 2.0. About six weeks into the release (mid-December and January) we started receiving a lot of bug reports from people who were using the new features.

Second Mini-Anecdote: Pretty much every new release of IronPython has exposed limitations in our runtime, our class libraries or our compilers. IronPython has really helped Mono become a better runtime.

Posted on 11 Jan 2007

Keith Olbermann's Evaluation of the Iraq War

by Miguel de Icaza

Just minutes before the speech last night, Keith Olbermann had a quick recap of the mistakes made so far.

Loved the delivery. Crooks and Liars has the video and the transcript (the video is better, as it has a moving sidebar with the summary, like Colbert's "The Word").

Posted on 11 Jan 2007

Functional Style Programming with C#

by Miguel de Icaza

C# 3.0 introduces a number of small enhancements to the language. The combination of these enhancements is what drives LINQ.

Although much of the focus has been on the SQL-like feel that it gives the language for manipulating collections, XML and databases in an efficient way, some fascinating side effects are explored in this tutorial.

The tutorial introduces the new features in C# one by one. There are a couple of interesting examples; first, a simple functional-style loop:

// From:
for (int i = 1; i < 10; ++i) Console.WriteLine(i);

// To (Range takes a start and a count, so 9 values: 1 through 9):
Sequence.Range(1, 9).ForEach(i => Console.WriteLine(i));
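A runnable version of that loop, assuming the names that shipped in .NET 3.5 (the class ended up being called Enumerable rather than the preview's Sequence) and a small ForEach extension method, since IEnumerable&lt;T&gt; does not include one:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class SequenceExtensions
{
    // IEnumerable<T> has no built-in ForEach, so the tutorial-style
    // call needs a small extension method like this one
    public static void ForEach<T>(this IEnumerable<T> seq, Action<T> action)
    {
        foreach (T item in seq)
            action(item);
    }
}

class RangeDemo
{
    static void Main()
    {
        // Enumerable.Range(start, count): 9 values, 1 through 9,
        // matching the classic for (int i = 1; i < 10; ++i) loop
        Enumerable.Range(1, 9).ForEach(i => Console.WriteLine(i));
    }
}
```

Note that Range's second argument is a count, not an upper bound, which is an easy thing to trip over when translating a for loop.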

A nice introduction to delayed evaluation follows, in the form of an RPN calculator:

// The following computes (5 * 2) - 1
Token[] tkns = {
    new Token(5),
    new Token(2),
    new Token(TokenType.Multiply),
    new Token(1),
    new Token(TokenType.Subtract)
};

Stack<int> st = new Stack<int>();

// The RPN token processor: the first lambda selects on the token type,
// the rest are the handlers for operand, add, subtract, multiply and
// divide (Switch is an extension method defined in the tutorial)
tkns.Switch(
    i => (int)i.Type,
    s => st.Push(s.Operand),
    s => st.Push(st.Pop() + st.Pop()),
    s => st.Push(-st.Pop() + st.Pop()),
    s => st.Push(st.Pop() * st.Pop()),
    s => { int d = st.Pop(); st.Push(st.Pop() / d); });


And finally, there is a section on how to parse WordML using LINQ, extracting the text:

	wordDoc.Element(w + "body").Descendants(w + "p").
		Select(p => new {
			ParagraphNode = p,
			Style = GetParagraphStyle(p),
			ParaText = p.Elements(w + "r").Elements(w + "t").
				Aggregate("", (s1, i1) => s1 + (string)i1) }).
		ForEach(p =>
			Console.WriteLine("{0} {1}", p.Style.PadRight(12), p.ParaText));
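The Aggregate call in that snippet is just a fold over the text runs of each paragraph; a self-contained sketch of the same idea (CollectText is an illustrative name, not from the tutorial):

```csharp
using System;
using System.Linq;

class AggregateDemo
{
    // Folds the runs into a single string, the same way the WordML
    // example collects the text of each <w:t> element in a paragraph
    public static string CollectText(string[] runs)
    {
        return runs.Aggregate("", (acc, s) => acc + s);
    }

    static void Main()
    {
        Console.WriteLine(CollectText(new[] { "Hello", ", ", "world" })); // prints "Hello, world"
    }
}
```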

Very educational read.

Posted on 10 Jan 2007

SecondLife: Cory Pre-Town Hall Answers

by Miguel de Icaza

After the release of the SecondLife client as open source software, Cory has a pre-town hall answers post.

The SecondLife client is dual licensed under the GPL and commercial licenses.

Regarding the use of Mono, Cory states:

Open Sourcing the client does not impact continued development of Mono or other planned improvements to LSL. Although Mono is also an Open Source project, that code will be running on the server, not the client. As has been previously discussed, part of the reason for going to Mono and the Common Language Runtime is to eventually allow more flexibility in scripting language choice.

Cory explains some of the rationale for open sourcing the client:

Third, security. While many of you raised questions about security, the reality is that Open Source will result in a more secure viewer and Second Life for everyone. Will exploits be found as a result of code examination? Almost certainly, but the odds are far higher that the person discovering the bug is someone working to make Second Life better. Would those exploits have been found by reverse engineering in order to crack the client? Yes, but with a far smaller chance that the exploit will be made public. Also, as part of the preparation for Open Source, we conducted a security audit and took other precautions to mitigate the security risks.

Fourth, as we continue to scale the Second Life development team --- and thank you to the many folks who have helped to get our hiring pipeline humming again --- Open Source becomes a great way for potential developers to demonstrate their skill and creativity with our code. Moreover, it makes it even easier for remote developers to become part of Linden Lab. The possibility space for Second Life is enormous, so the more development horsepower we can apply to it --- whether working for Linden Lab or not --- the faster we all can take Second Life from where it is today into the future.

And also, a new book on SecondLife is coming out.

Posted on 10 Jan 2007
