Last night Cristina Kirchner, Argentina's President, live-tweeted the events around the hijacking of Evo Morales' airplane over Europe:
Posted on 03 Jun 2013
For many years, I have learned various subjects (mostly programming related, like languages and frameworks) purely by reading a book, blog posts or tutorials on the subjects, and maybe doing a few samples.
In recent years, I "learned" new programming languages by reading books on the subject. And I have noticed an interesting phenomenon: when given the choice between using these languages on a day-to-day basis or using another language I am already comfortable with, I go for the language I am comfortable with. This, despite my inner desire to use the hot new thing, or to try out new ways of solving problems.
I believe the reason this is happening is that most of the texts I have read that introduce these languages are written by hackers and not by teachers.
What I mean by this is that these books are great at describing and exposing every feature of the language and have some clever examples shown to you, but none of these actually force you to write code in the language.
Compare this to Scheme and the book "Structure and Interpretation of Computer Programs". That book is designed with teaching in mind, so at the end of every section where a new concept has been introduced, the authors provide a series of exercises specifically tailored to put the knowledge that you just gained to use. Anyone who reads that book and does the exercises is guaranteed to come out a solid Scheme programmer, and will learn more about computing than they would from any other book.
In contrast, the experience of reading a modern computing book from most of the high-tech publishers is very different. Most of the books being published do not have an educator reviewing the material, at best they have an editor that will fix your English and reorder some material and make sure the proper text is italicized and your samples are monospaced.
When you finish a chapter in a modern computing book, there are no exercises to try. Your choices are either to take a break by checking some blogs or to keep marching on a quest to collect more facts in the next chapter.
During this process, while you amass a bunch of information, at some neurological level, you have not really mastered the subject, nor gained the skills that you wanted. You have merely collected a bunch of trivia which most likely you will only put to use in an internet discussion forum.
What books involving an educator will do is include exercises that have been tailored to use the concepts that you just learned. When you come to this break, instead of drifting to the internet you can sit down and try to put your new knowledge to use.
Well-developed exercises are an application of the psychology of Flow: they ensure that the exercise matches the skills that you have developed, and they guide you along a path that keeps you in an emotional state ranging from control to arousal to joy (flow).
Back in 1988, when I got the first edition of "The C++ Programming Language", there were a couple of very simple exercises in the first chapter that took me a long time to get right, and they both proved very educational.
The first exercise was "Compile Hello World". You might think, that is an easy one, I am going to skip that. But I had decided that I was going to do each and every single one of the exercises in the book, no matter how simple. So if the exercise said "Build Hello World", I would build Hello World, even if I was already a seasoned assembly language programmer.
It turned out that getting "Hello World" to build and run was very educational. I was using the Zortech C++ compiler on DOS back then, and getting a build turned out to be almost impossible. I could not get the application to build; I got an obscure error and had no way to fix it.
It took me days to figure out that I had the Microsoft linker in my path before the Zortech Linker, which caused the build to fail with the obscure error. An important lesson right there.
The second exercise that I struggled with was a simple class. The simple class was missing a semicolon at the end. But unlike modern compilers, the Zortech C++ compiler's error message at the time was less than useful. It took me a long time to spot the missing semicolon, because I was not paying close enough attention.
Doing these exercises trains your mind to recognize that "useless error message gobble gobble" actually means "you are missing a semicolon at the end of your class".
More recently, I learned in the same hard way that the F# error message "The value or constructor 'foo' is not defined" really means "You forgot to use 'rec' in your let", as in:
let foo x = if x = 1 then 1 else foo (x - 1)
That is a subject for another post, but the F# error message should tell me what I did wrong at the language level, as opposed to explaining why the compiler is unable to figure things out in its internal processing of the matter.
Nowadays we are cranking books left and right to explain new technologies, but rarely do these books get the input from teachers and professional pedagogues. So we end up accumulating a lot of information, we sound lucid at cocktail parties and might even engage in a pointless engineering debate over features we barely master. But we have not learned.
Coming up with the ideas to try out what you have just learned is difficult. As you think of things that you could do, you quickly find that you are missing knowledge (discussed in further chapters) or your ideas are not that interesting. In my case, my mind drifts into solving other problems, and I go back to what I know best.
Please, build exercises into your books. Work with teachers to find the exercises that match the material just exposed and help us get in the zone of Flow.
Posted on 25 Apr 2013
Non-government controlled currency systems are now in vogue. Currencies that are not controlled by some government that might devalue your preciously earned pesos at the blink of an eye.
BitCoin is backed by powerful cryptography and math to ensure a truly digital currency. But it poses significant downsides: for one, governments can track your every move, and every transaction is stored on each bitcoin, making it difficult to prevent a tax audit in the future by The Man.
Today, I am introducing an alternative currency system that both keeps the anonymity of your transactions, and is even more secure than the crypto mumbo jumbo of bitcoins.
Today, I am introducing the MigCoin.
Like bitcoins, new MigCoins will be minted over time, to cope with the creation of value in the world.
Like bitcoins, the supply of MigCoins will be limited and will eventually plateau. Like bitcoin, the MigCoin is immune to the will of some Big Government bureaucrat that wants to control the markets by printing or removing money from circulation. Just like this:
Projected number of Bitcoins and MigCoins over time.
Unlike bitcoins, I am standing by them and I am not hiding behind a false name.
Like BitCoins, MigCoins come with a powerful authentication system that can be used to verify their authenticity. Unlike BitCoins, they do not suffer from this attached "log" that Big Brother and the Tax Man can use to come knocking on your door one day.
How does this genius of a currency work? How can you guarantee that governments or rogue entities won't print their own MigCoins?
The answer is simple my friends.
MigCoins are made of my DNA material.
Every morning, when I wake up, for as long as I remain alive, I will spit on a glass. A machine will take the minimum amount of spit necessary to lay down on a microscope slide, and this is how MigCoins are minted.
Then, you guys send me checks, and I send you the microscope slides with my spit.
To accept MigCoins payments all you have to do is carry a DNA sequencer with you, put the microscope slide on it, press a button, and BAM! 10 minutes later you have your currency validated.
To help accelerate the adoption of MigCoins, I will be offering bundles of MigCoins with the Illumina MiSeq Personal DNA sequencer:
Some might argue that the machine alone is 125,000 dollars and validating one MigCoin is going to set me back 750 dollars.
Three words my friends: Economy of Scale.
We are going to need a few of you to put in some extra pesos early on to get the prices of the DNA machines down.
Early Adopters of MigCoins
I will partner with visionaries like these to get the first few thousand sequencers built and start to get the prices down. Then we will hire that ex-Apple guy who was CEO of JC Penney to get his know-how on getting the prices of these puppies down.
Like Bitcoin, I expect to see a lot of nay-sayers and haters. People that will point out flaws in this system. But you know what?
The pace of innovation cannot be held back by old-school economists that "don't get it" and pundits on CNN trying to make a quick buck. Haters are going to hate. 'nuff said.
Next week, I will be launching MigXchange, a place where you can trade your hard BitCoins for slabs of spit.
Join the revolution! Get your spit on!
Posted on 12 Apr 2013
We obtained some confidential information about the upcoming Facebook Phone. Here is what we know about it so far:
The FacebookPhone will be free (no contract) but will pause your call every 30 seconds to play an ad for 20— Miguel de Icaza (@migueldeicaza) March 29, 2013
Everyone will get an FacebookPhone to use as a honeypot trap for unwanted calls.— Miguel de Icaza (@migueldeicaza) March 29, 2013
@migueldeicaza it will also force a call to a person you haven't talked to in 10 years for every 5 calls you make.— Jonathan Chambers (@jon_cham) March 29, 2013
@migueldeicaza it'll use the ambient light sensor to detect when your not listening to the ad on your phone and just restart it— Alex Trott (@AlexTrott_) March 29, 2013
@migueldeicaza and ad calls cannot be muted, refused and the disconnect button will be suspended for the period of the call!...— Sumit Maitra (@sumitkm) March 29, 2013
@migueldeicaza the proximity sensor will be used to switch between earphone and speakerphone if it detects you've put the phone down!— Sumit Maitra (@sumitkm) March 29, 2013
@migueldeicaza Before you're able to call, you must finish a game of Farmville. When done, it asks "Do you wish to call *related friend*?"— Marco Kuiper (@marcofolio) March 29, 2013
The FacebookPhone will charge you 100 dollars to dial people who you have not friended— Miguel de Icaza (@migueldeicaza) March 29, 2013
@migueldeicaza After the call, a transcript and recording will be posted to your timeline, with the other party tagged.— Chris Howie (@cdhowie) March 29, 2013
@migueldeicaza it will have a single hardware button -'Like'— Martin Topping (@eMartinTopping) March 29, 2013
The FacebookPhone has no lock code, as privacy is just an illusion— Miguel de Icaza (@migueldeicaza) March 29, 2013
@migueldeicaza When you call someone, their phone will ring on the lowest volume unless you pay to "Promote" the call.— Brent Schooley (@brentschooley) March 29, 2013
@migueldeicaza It will also change the UI every week to expose features you don't use— Shmueli Englard (@Shmuelie) March 29, 2013
@migueldeicaza "The application 'My Friend Secrets' would like the following permissions: * Eavesdrop on all of your FacebookPhone calls."— Chris Howie (@cdhowie) March 29, 2013
Posted on 29 Mar 2013
While reading Dave Winer's Why Windows Lost to Mac post, I noticed many parallels with my own experience with Linux and the Mac. I will borrow the timeline from Dave's post.
I invested years of my life in the Linux desktop, first as a personal passion (Gnome) and then while working for two Linux companies (my own, Ximian, and then Novell). During this period, I believed strongly in dogfooding our own products. I believed that both my team and I had to use the software we wrote and catch bugs and errors before they reached our users. We were pretty strict about it, both from an ideological point of view, back in the days of all-software-will-be-free, and then practically, during my tamer business days. I routinely chastised fellow team members that had opted for the easy path and avoided our Linux products.
While I had Macs at Novell (to support Mono on MacOS), it would take a couple of years before I used a Mac regularly. On a vacation to Brazil around 2008 or so, I decided to take only the Mac for the trip and learn to live with the OS as a user, not just as a developer.
Computing-wise, that three-week vacation turned out to be very relaxing. The machine would suspend and resume without problems, WiFi just worked, and audio did not stop working. I spent three weeks without having to recompile the kernel to adjust this or that, without fighting the video drivers, and without dealing with the bizarre and random speed degradation that my ThinkPad suffered.
While I missed the comprehensive Linux toolchain and userland, I did not miss having to chase the proper package for my current version of Linux, or beg someone to package something. Binaries just worked.
From this point on, using the Mac was a part-time gig for me. During the Novell layoffs, I returned my laptop to Novell and I was left with only one Linux desktop computer at home. I purchased a Mac laptop and while I fully intended to keep using Linux, the dogfooding driver was no longer there.
Dave Winer writes, regarding Windows:
Back to 2005, the first thing I noticed about the white Mac laptop, that aside from being a really nice computer, there was no malware. In 2005, Windows was a horror. Once a virus got on your machine, that was pretty much it. And Microsoft wasn't doing much to stop the infestation. For a long time they didn't even see it as their problem. In retrospect, it was the computer equivalent of Three Mile Island or Chernobyl.
To me, the fragmentation of Linux as a platform, the multiple incompatible distros, and the incompatibilities across versions of the same distro were my Three Mile Island/Chernobyl.
Without noticing, I stopped turning on the screen of my Linux machine during 2012. By the time I moved to a new apartment in October of 2012, I did not even bother plugging the machine back in, and to this date, I have yet to turn it on.
Even during all of my dogfooding and Linux advocacy days, whenever I had to recommend a computer to a single new user, I recommended a Mac. And whenever I gave away computer gifts to friends and family, it was always a Mac. Linux just never managed to cross the desktop chasm.
Posted on 05 Mar 2013
We spent a year designing the new UI and features of Xamarin Studio (previously known as MonoDevelop).
I shared some stories of the process on the Xamarin blog.
After our launch, we open sourced all of the work that we did, as well as our new Gtk+ engine for OSX. Lanedo helped us tremendously in making Gtk+ 2.x both solid and amazing on OSX (down to the new Lion scrollbars!). All of their work has either been upstreamed to Gtk+ or is in the process of being upstreamed.
Posted on 22 Feb 2013
"Reality Distortion Field" is a modern-day cop out: a tool that writers who lack the intellectual curiosity to explain the world can deploy at will to explain away excitement or success in the marketplace. Invoking this magical super power saves the writer from doing actual work and research. It is a con perpetrated against the readers.
The expression originated as an observation made by those who worked with Steve to describe his convincing passion. It was an insider joke/expression which has now been hijacked by sloppy journalists whenever a subject is over their heads.
The official Steve Jobs biography left much to be desired. Here a journalist was given unprecedented access to Steve Jobs and could have gotten answers to the thousands of questions that we still have to this day. How did he approach problems? Did he have a method? How did he really work with his team? How did he turn his passion for design into products? How did he make strategic decisions about the future of Apple? How did the man balance engineering and marketing problems?
The biography has some interesting anecdotes, but fails to answer any of these questions. The biographer was not really interested in understanding or explaining Steve Jobs. He collected a bunch of anecdotes, strung them together in chronological order, had the text edited and cashed out.
Whenever the story gets close to an interesting historical event, or starts exploring a big unknown of Steve's work, we are condescendingly told that "Steve Activated the Reality Distortion Field".
Every. Single. Time.
Not once did the biographer try to uncover what made people listen to Steve. Not once did he try to understand the world in which Steve operated. The breakthroughs of his work are described with the same passion as a Reuters news feed: an enumeration of his achievements, with anecdotes to glue the thing together.
Consider the iPhone: I would have loved to know how the iPhone project was conceived. What internal process took place that allowed Apple to gain the confidence to become a phone manufacturer? There is a fascinating story of the people that made this happen, millions of details of how this project was evaluated, and what the vision for the project was, down to every small detail that Steve cared about.
Instead of learning about the amazing hardware and software engineering challenges that Steve faced, we are told over and over that all Steve had to do was activate his special super power.
The biography, in short, is a huge missed opportunity. Unprecedented access to a man that reshaped entire industries, and all we got was some gossip.
The "Reality Distortion Field" is not really a Steve Jobs super-power, it is a special super power that the technical press uses every time they are too lazy to do research.
Why do expensive and slow user surveys, or purchase expensive research from analysts to explain why some product is doing well, or why people are buying it, when you can just slap a "they activated the Reality Distortion Field and sales went through the roof" statement in your article?
As of today, a Google News search for "Reality Distortion Field Apple" reports 532 results for the last month.
Perhaps this is just how the tech press must operate nowadays. There is just no time to do research as new products are being unveiled around the clock, and you need to deliver opinions and analysis on a daily basis.
But as readers, we deserve better. We should reject these explanations for what they are: a cheap grifter trick.
Posted on 07 Nov 2012
After a year and a half, we have finally released Mono 3.0.
As I discussed last year, we will be moving to a more nimble release process with Mono 3.0. We are trying to reduce our inventory of pending work and get new features to everyone faster. This means that our "master" branch will remain stable from now on, and that large projects will instead be developed in branches that are regularly landed into our master branch.
Check our release notes for the full details of this release. But here are some tasty bits:
Also, expect F# 3.0 to be bundled in our OSX distribution.
Posted on 22 Oct 2012
Let me share with you some links that I found interesting in the past few weeks. These should keep the most diligent person busy for a few hours.
Talbot Crowell's Introduction to F# 3.0 slides from Boston CodeCamp.
Bertrand Meyer (The creator of Eiffel, father of good taste in engineering practices) writes Fundamental Duality of Software Engineering: on the specifications and tests. This is one of those essays where every idea is beautifully presented. A must read.
MonkeySpace slide deck on MonoGame.
Oak: Frictionless development for ASP.NET MVC.
Simon Peyton Jones on video talks about Haskell, past, present and future. A very tasty introductory talk to the language. David Siegel says about this:
Simon Peyton-Jones is the most eloquent speaker on programming languages. Brilliant, funny, humble, adorable.
Rob Pike's talk on Concurrency is not Parallelism. Rob is one of the crispest minds in software development: anything he writes, you must read; everything he says, you must listen to.
SparkleShare, the open source file syncing service running on top of Git released their feature-complete product. They are preparing for their 1.0 release. SparkleShare runs on Linux, Mac and Windows. Check out their Release Notes.
Experts warn that Canonical might likely distribute a patched version that modifies your documents and spreadsheets to include ads and Amazon referral links.
Better debugging tools for Google Native Client.
Touch Draw comes to MacOS, great vector drawing application for OSX. Good companion to Pixelmator and great for maintaining iOS artwork. It has great support for structured graphics and for importing/exporting Visio files.
MonoGame 3D on the Raspberry Pi video.
Fruit Rocks a fun little game for iOS.
@Redth, the one man factory of cool hacks has released:
Phil Haack blogs about MonkeySpace
Kicking the Twitter habit.
Twitter Q&A with TJ Fixman, writer for Insomniac Games.
Debunking the myths of budget deficits: Children and Grandchildren do not pay for budget deficits, they get interest on the bonds.
Updated: Programming F# 3.0, 2nd Edition is out. This delightful book on F# by Chris Smith has been updated to cover the new and amazing type providers in F#.
ServiceStack now has 113 contributors.
From Apple Insider: Google may settle mobile FRAND patent antitrust claim.
The Salt Lake City Tribune editorial board endorses Obama over Romney:
In considering which candidate to endorse, The Salt Lake Tribune editorial board had hoped that Romney would exhibit the same talents for organization, pragmatic problem solving and inspired leadership that he displayed here more than a decade ago. Instead, we have watched him morph into a friend of the far right, then tack toward the center with breathtaking aplomb. Through a pair of presidential debates, Romney’s domestic agenda remains bereft of detail and worthy of mistrust.
Therefore, our endorsement must go to the incumbent, a competent leader who, against tough odds, has guided the country through catastrophe and set a course that, while rocky, is pointing toward a brighter day. The president has earned a second term. Romney, in whatever guise, does not deserve a first.
From Blue States are from Scandinavia, Red States are from Guatemala the author looks at the differences in policies in red vs blue states, and concludes:
Advocates for the red-state approach to government invoke lofty principles: By resisting federal programs and defying federal laws, they say, they are standing up for liberty. These were the same arguments that the original red-staters made in the 1800s, before the Civil War, and in the 1900s, before the Civil Rights movement. Now, as then, the liberty the red states seek is the liberty to let a whole class of citizens suffer. That’s not something the rest of us should tolerate. This country has room for different approaches to policy. It doesn’t have room for different standards of human decency.
Esquire's take on the 2nd Presidential Debate.
Dave Winer wrote Readings from News Execs:
There was an interesting juxtaposition. Rupert Murdoch giving a mercifully short speech saying the biggest mistake someone in the news business could make is thinking the reader is stupid. He could easily have been introducing the next speaker, Bill Keller of the NY Times, who clearly thinks almost everyone who doesn't work at the NY Times is stupid.
What do you know, turns out that Bill Moyers is not funded by the government nor does he get tax money, like many Republicans like people to believe. The correction is here.
"Non-Alcoholic Sparkling Beverage" - Whole Foods' $7.99 name for "bottle of soda".
Problem with most religious people is that their faith tells them to play excellently in game of life, but they want to be the referees.
Hylke Bons on software engineering:
"on average, there's one bug for every 100 lines of code" this is why i put everything on one line
If government doesn't create jobs, isn't Romney admitting that his campaign is pointless?
OH "It is a very solid grey area" #sc34 #ooxml
"I don't care how many thousand words your blog post is, the words 'SYMBIAN WAS WINNING' mean you're too high on meth to listen to."
Jeremy Scahill asks the questions about warmonger Max Boot:
Do they make a Kevlar pencil protector? Asking for a think tanker.
Max Boot earned a Purple Heart(-shaped ink stain on his shirt) during the Weekly Standard War in 1994.
"W3C teams with Apple, Google, Mozilla on WebPlatform"... or we could all just sponsor a tag on StackOverflow.
Most programmers who claim that types "get in the way" had a sucky experience with Java 12 years ago, tried Python, then threw the baby out.
How Hollywood Studios employ creative accounting to avoid sharing the profits with the participants. If you were looking at ways to scam your employees and partners, look no further.
Starvation in Gaza: State forced to release 'red lines' document for food consumption.
Dirty tricks and disturbing trends: Billionaires warn employees that if Obama is reelected, they will be facing layoffs.
Here we are today, three months later, and within the last month alone, these two parents lost two children, and the two remaining ones are sick as well. Sunday is already in hospital with malaria, in serious condition, and Mahm is sick at home. “I’ve only two children left,” Michael told me today over the phone. The family doesn’t have money to properly treat their remaining children. The hospitals are at full capacity and more people leave them in shrouds than on their own two feet. I ask you, beg of you to help me scream the story of these children and their fate, dictated by the heartless, immoral Israeli government.
When Suicide is Cheaper: the horrifying tales of Americans that cannot afford health care.
Paul Ryan is not that different from Todd Akin when it comes to women's rights.
A Windows 8 critique: someone is not very happy with it.
On "meritocracy": what is wrong with it.
Fascinating read on the fast moving space of companies: Intimate Portrait of Innovation, Risk and Failure Through Hipstamatic's Lens.
Kathy Sierra discusses sexism in the tech world. How she changed her mind about it, and the story that prevented her from seeing it.
Response @antirez's sexist piece.
Sony's Steep Learning Process a look at the changing game with a focus on Sony's challenges.
Cat fact and Gif provides Cat facts with a gif that goes with it. The ultimate resource of cat facts and gifs.
Posted on 21 Oct 2012
The gnome-config script was a precursor to pkg-config; both are tools that you can run to extract the information needed to compile some code, link some code, or check for a version. gnome-config itself was a pluggable version of Tcl's tclConfig.sh script.
The idea is simple: pkg-config is a tiny little tool that uses a system database of packages to provide version checking and build information to developers. Said database is merely a well-known directory in the system containing files with the extension ".pc", one per package.
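Each ".pc" file is just a short text file of variables and keyword fields describing one package. As an illustrative sketch only (the paths and values below are made up, not any real mono.pc shipped by a distribution), a file like /usr/lib/pkgconfig/mono.pc might contain:

```
# mono.pc -- hypothetical sketch; paths and version are illustrative
prefix=/usr
libdir=${prefix}/lib
includedir=${prefix}/include

Name: Mono
Description: Mono embedding API
Version: 2.10
Libs: -L${libdir} -lmono-2.0
Cflags: -I${includedir}/mono-2.0
```

The Libs and Cflags fields are what pkg-config echoes back when you ask for --libs and --cflags, and Version is what the version checks compare against.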
These scripts are designed to be used in shell scripts to probe if a particular software package has been installed, for example, the following shell script probes whether Mono is installed in the system:
# shell script
if pkg-config --exists mono; then
    echo "Found Mono!"
else
    echo "You can download mono from www.go-mono.com"
fi
It can also be used in simple makefiles to avoid hardcoding the paths to your software. The following makefile shows how:
CFLAGS = `pkg-config --cflags mono`
LIBS = `pkg-config --libs mono`

myhost: myhost.c
And if you are using Automake and Autoconf to probe for the existence of a module with a specific version and extract the flags needed to build a module, you would do it like this:
AC_SUBST(MONO_FLAGS)
AC_SUBST(MONO_LIBS)
if pkg-config --atleast-version=2.10 mono; then
    MONO_FLAGS=`pkg-config --cflags mono`
    MONO_LIBS=`pkg-config --libs mono`
else
    AC_MSG_ERROR("You need at least Mono 2.10")
fi
There are two main use cases for pkg-config.
Probing: You use the tool to probe for some condition about a package and take an action based on the result. For this, you use the pkg-config exit code in your scripts to determine whether the condition was met. This is what both the automake sample and the first script show.
Compile Information: You invoke the tool, and it prints the results to standard output. To store the result or pass the values along, you use shell backticks (`). That is all there is to it (example: version=`pkg-config --version`).
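To make the two use cases concrete, here is a self-contained sketch, assuming only that pkg-config itself is installed: it mints a throwaway "demo" package (the demo.pc contents and the /tmp path are made up for illustration) and then exercises both the probing style and the compile-information style:

```shell
# Create a throwaway package database entry for a hypothetical package "demo"
mkdir -p /tmp/pc-demo
cat > /tmp/pc-demo/demo.pc <<'EOF'
Name: demo
Description: Throwaway package for this demo
Version: 2.11
Libs: -ldemo
Cflags: -I/opt/demo/include
EOF
# Tell pkg-config where to find it
export PKG_CONFIG_PATH=/tmp/pc-demo

# Use case 1, probing: the exit code drives the branch
if pkg-config --atleast-version=2.10 demo; then
    # Use case 2, compile information: capture stdout with backticks
    DEMO_CFLAGS=`pkg-config --cflags demo`
    echo "demo found, cflags: $DEMO_CFLAGS"
else
    echo "demo is too old or missing"
fi
```

If the version requirement is not met, or the package is missing entirely, the same script takes the else branch instead; that exit-code behavior is exactly what the automake snippet above relies on.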
The tool is so immensely simple that anyone can learn every command that matters in less than 5 minutes. The whole thing is beautiful because of its simplicity.
Perhaps it was a cultural phenomenon, perhaps it was someone that had nothing better to do, perhaps it was someone that was just trying to be thorough, but somebody introduced one of the most poisonous memes into the pool of ideas around pkg-config.
Whoever did this, thought that the "if" statement in shell was a complex command to master or that someone might not be able to find the backtick on their keyboards.
And they hit us, and they hit us hard.
They introduced pkg.m4, a set of macros intended to be used with autoconf, that would allow you to replace the handful of command line flags to pkg-config with one of their macros (PKG_CHECK_MODULES, PKG_CHECK_EXISTS). To do this, they wrote a 200-line script which replaces one line of shell code with almost a hundred. Here is a handy comparison of what these offer:
#
# Shell style
#
AC_SUBST(MONO_LIBS)
AC_SUBST(MONO_CFLAGS)
if pkg-config --atleast-version=2.10 mono; then
    MONO_CFLAGS=`pkg-config --cflags mono`
    MONO_LIBS=`pkg-config --libs mono`
else
    AC_MSG_ERROR(Get your mono from www.go-mono.com)
fi

#
# With the idiotic macros
#
PKG_CHECK_MODULES([MONO], [mono >= 2.10],, [
    AC_MSG_ERROR(Get your mono from www.go-mono.com)
])

#
# If you do not need split flags, shell becomes shorter
#
if pkg-config --atleast-version=2.10 mono; then
    CFLAGS="$CFLAGS `pkg-config --cflags mono`"
    LIBS="$LIBS `pkg-config --libs mono`"
else
    AC_MSG_ERROR(Get your mono from www.go-mono.com)
fi
The above shows the full benefit of using a macro: MONO is a prefix that will have LIBS and CFLAGS extracted, so the shell script loses. The reality is that the macros only give you access to a subset of the functionality of pkg-config (no support for splitting -L and -l arguments, querying provider-specific variable names, or performing macro expansion).
Most projects adopted the macros because they copy/pasted the recipe from somewhere else and thought this was the right way of doing things.
The hidden price is that saving those few lines of code actually inflicts a world of pain on your users. You will probably see this in your forums in the form of:
Subject: Compilation error

I am trying to build your software, but when I run autogen.sh, I get
the following error:

checking whether make sets $(MAKE)... yes
checking for pkg-config... /usr/bin/pkg-config
./configure: line 1758: syntax error near unexpected token `FOO,'
./configure: line 1758: `PKG_CHECK_MODULES(FOO, foo >= 2.9)'
And then you will engage in a discussion that, in the best case scenario, helps the user correctly configure his ACLOCAL_FLAGS or create his own "setup" script that will properly configure his system, and your new users will learn the difference between running a shell script and "sourcing" a shell script to properly set up their development systems.
In the worst case scenario, the discussion will devolve into how stupid your user is for not knowing how to use a computer and how he should be shot in the head and taken out to the desert for his corpse to be eaten by vultures; because, god dammit, they should have googled that on their own, and they should have never in the first place have installed two separate automake installations in two prefixes, without properly updating their ACLOCAL_FLAGS or figured out on their own that their paths were wrong in the first place. Seriously, what moron in this day and age is not familiar with the limitations of aclocal and the best practices to use system-wide m4 macros?
Hours are spent on these discussions every year. Potential contributors to your project are driven away, countless hours that could have gone into fixing bugs and producing code are wasted, and your users are frustrated. And you saved 4 lines of code.
The pkg.m4 file is a poison that is holding us back.
We need to end this reign of terror.
Send pull requests to eliminate that turd, and ridicule anyone that suggests that there are good reasons to use it. In the war for good taste, it is ok to vilify and scourge anyone that defends pkg.m4.
Posted on 20 Oct 2012