Two stories from recent years got me thinking. (If you know the stories, you’re allowed to skip to the last paragraph.)
1. A British father helped his wife give birth at home. He’s not the first, and won’t be the last, but there’s a lot that can go wrong. You really want to get it right, and if you don’t have a midwife or doctor handy, and you (and the woman giving birth) never happened to learn how to deliver a baby, what do you do? Leroy Smith turned to the web, via his mobile phone. He found a wikiHow article, and by following the 10 steps, he did his part well.
2. Surgeon David Nott had a more complex challenge. A hippo had bitten off a boy’s arm, and the boy faced death from infection within days. Amputating his shoulder blade and collar bone would save him – but the doctor had no experience of this unusual and complex procedure, and no one he knew in the Democratic Republic of Congo could help. A colleague in the UK could, though – and did so via SMS. In two very long text messages he explained the procedure, and wished Dr Nott luck. The operation – carried out in a basic operating theater, without the equipment and support the doctor would have expected back home in the UK – was a success, and the boy’s life was saved.
Information is often the key component of the appropriate technology for solving a problem. Whether we’re talking about health services or development, the right information can be the difference between a good outcome and a failure.
I’m inspired to see wikiHow used in this way – as I am with the stories I hear of Appropedia being used in the field. It’s also true that making the best use of expert knowledge, as Dr Nott was able to do, supports good outcomes. Combining these ideas – enhancing ways of accessing knowledge, and making the best knowledge available – continues to guide our mission.
A fellow Appropedian asked me about options for lightweight Linux distros, for using on old hardware. Thought I’d share my response here.
My knowledge is limited, but here’s what I’ve learnt:
- Join a local LUG – look out for days when they help people install Linux. Beware of installing Linux when you won’t be face-to-face with Linux geeks for a long time, especially if you’re doing something more problematic like installing on a laptop – I made this mistake, and it was a horrible time sink. Online support doesn’t cut it.
- Vector and other Slackware-based distros don’t seem user-friendly, and neither does DSL (Damn Small Linux) – I looked into them, but with only about two years’ experience in Linux, I didn’t feel up to any of these choices. With more experience, and the backing of geek friends, they may be an option for you. (DSL is also a much older distro, with much older packages, a.k.a. program versions, but it works on very limited hardware, and is possibly more reliable than other ultralight distros such as Puppy Linux.)
- I recommend Openbox (window manager) and LXDE (desktop environment using Openbox – meaning Openbox is the lighter of these two light options). These are really nice and lean – lighter than XFCE, but nicer to use. Expect to see these become more popular. You can add them to any distro, but where they’re not one of the standard options, in some cases there can be clashes (probably a bigger problem on a laptop).
- I like a distro that’s set up to be lean, but easy to use.
- I’m not hung up on installing only “free” (open source) software – I want Skype and I want video codecs. (I install Linux firstly because I want an operating system that does what I need, not to make a statement.) Ubuntu makes this a bit of a hassle – you have to add repositories and install certain packages (programs and codecs), and new users don’t know this – they just wonder why things don’t work. Debian makes it really hard work for a newbie, especially if any of your hardware doesn’t have a perfectly free (open source) driver.
- I strongly prefer something that is at least based on a major distro, and uses the package repositories of that distro. There’s the potential for better support and in theory for bug fixing (Ubuntu is buggy anyway, in my experience, but it does have good support). It also means far more software choice. This, with the previous points, leaves me with one distro:
- CrunchBang Linux: it’s based on Ubuntu but uses Openbox, with some very cool usability tweaks, including partial use of LXDE. It also comes with Skype and video codecs installed. This is the only distro I know of that comes with Openbox by default (excluding Debian and Knoppix, which I don’t recommend – see below). I’m not usually a fan of Ubuntu, for several reasons including bugginess when I used it in the past – but in spite of that, CrunchBang is working quite well for me at the moment, and it has an active and helpful community. This is the most promising distro I’ve used.
- Debian 5.0 comes with LXDE as one of its standard options, which means it has Openbox – but Debian was unnecessarily difficult for me. When it didn’t even recognize the hard disk on my ThinkPad, I thought: if this is a sign of how things work in Debian, I’m trying something else.
- And Knoppix also comes with LXDE standard. It’s not designed for installation to hard disk though, unless you really know Linux. However, it’s apparently a great rescue disk, with a reputation for hardware recognition – the MacGyver of Linux distros – so I keep a Knoppix LiveCD handy, just in case. (I’d try the CrunchBang LiveCD first, but if things are really screwed up and that doesn’t work, I’ll try Knoppix.)
- I’ve heard good things about Puppy Linux – it was flaky when I tried it around 2006, but may have improved. It’s also kind of a backwater in Linux development – a lot of non-standard stuff, running as root by default (which sounds like a bad idea to me and to many Linux people), with its own kind of installation, and far fewer packages than a major distro. So unless you need to go super-light (even lighter than CrunchBang) I wouldn’t recommend it.
- I just discovered boxpup – it looks like Puppy with Openbox. I’m guessing it’s a bit harder than CrunchBang, with fewer package choices, but probably even lighter. I would still have some concerns about bugginess, security, package choice and maybe usability, but if you’re keen, you could try it out with some help from your LUG.
- Anything I’ve said related to something being hard to use (e.g. Debian) becomes much less of an issue if you have geeky friends close by and/or belong to a LUG. My preference though: Get something you can mostly handle yourself. You’ll still need help, but there’s no need to make it harder than necessary.
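As an aside on the repositories point above: on Ubuntu-based distros (including CrunchBang), enabling non-free software mostly comes down to which components are listed in apt’s sources file. A minimal sketch – the release name “jaunty” and the mirror URL here are just examples, not a recommendation:

```
# /etc/apt/sources.list (example lines – substitute your own release name)
deb http://archive.ubuntu.com/ubuntu/ jaunty main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu/ jaunty-updates main restricted universe multiverse
```

After editing, run `sudo apt-get update`; then a metapackage like `ubuntu-restricted-extras` pulls in the common codecs and fonts in one go.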
So join a LUG, check out CrunchBang, and enjoy Linux!
Thanks to Jon Camfield for his input at the talk page where this started.
Innovative uses of the internet in education, in Mexico and India:
Universidad de la Tierra – Mexico
Located in the southern Mexican city of Oaxaca de Juarez, the Universidad de la Tierra is an alternative learning initiative through which students learn from the world by doing. This process happens largely in communication with others, in the form of study/reading circles (“communities of practice”) and intercultural exchange. The NewWorkSpaces online community tool (Unitierra’s space here) enables learners to access “collaborative technology that will help [them]…convene conversations, co-create and publish documents, invite others into…learning experiences, and exchange…knowledge and resources.” Other means of sharing learning experiences include libraries, documentation centres, community radio, media campaigns, and publishing. These modes also provide dialogue opportunities around Unitierra activities such as those with indigenous communities engaged in cultural regeneration, technological and socio-political innovation, and social struggle (e.g. through workshops, videos, the creation of ecological dry toilets and solar arrays, organic agriculture, and alternative media). More.
Samvidha – India
This project is a response to the need to make relevant internet-based information accessible to all of India’s teachers and students at a low cost. Carried out by the non-profit Media Lab Asia in collaboration with Indian Institute of Technology (IIT) Kharagpur, the Samvidha project is an effort to bridge the digital divide by providing off-line access to curriculum-related internet content using a query-based system. Individual variations among students are captured in user profiles, which include each student’s interests and capabilities. This idea of offering personalised content access and presentation is also reflected in the fact that navigation interfaces are offered in Bengali, Hindi, and English. Content which is appropriate for the user’s needs is then emailed to the user in the school; information located on the internet is provided to the user in English or, where available, in a given Indian language. More. See also the Samvidha page on Media Lab Asia’s website.
Thanks to The Communication Initiative Network for this news – “Where communication and media are central to social and economic development”.
Relevant Appropedia wiki pages:
I sent this to friends in Taiwan, but also want to share it more widely:
You may know already, but BarCampTaipei is happening in Taiwan on Dec 10. Joy Tang will be there, talking about wifi (and Linux) for African villages.
Joy is part of the LXDE team – LXDE is an excellent light Linux desktop, made by a Taiwanese hacker, “PCMan” a.k.a. Hong Jen Yee. It works very well with existing Linux distros, and I think is a great step forward. I am supporting LXDE, e.g. helping with documentation, as I think it has great potential to make Linux more usable and make computers more accessible in poorer countries. PCMan won’t be at the BarCamp, but a few of the LXDE team will be.
So I wanted to let you know, so you’re aware of this great development in Linux that comes from Taiwan – and so you can get in touch with each other, if you’re interested.
Btw, I really liked Taiwan, and I hope to visit next year, maybe in the middle of the year or earlier. Hope to catch you then!
Efficient code is green code – code that works better on old or “light” computers used in developing countries, better on the shiny new netbooks (such as the Asus Eee PC) coming out these days – and that makes a fast computer even faster. Efficient code, it seems, has no downside.
Jim Gettys of OLPC says in a July 2006 interview:
There seems to be a common fallacy among programmers that using memory is good: on current hardware it is often much faster to recompute values than to have to reference memory to get a precomputed value. A full cache miss can be hundreds of cycles, and hundreds of times the power consumption of an instruction that hits in the first level cache. Making things smaller almost always makes them faster (and lower power). Similarly, it can be much faster to redraw an area of the screen than to copy a saved image from RAM to a screen buffer. Many programmers’ presumptions are now completely incorrect and we need to reeducate ourselves…
A large part of this task is raising people’s consciousness that we’ve become very sloppy on memory usage, and often there is low hanging fruit making things use less memory (and execute faster and use less power as a result). Sometimes it is poor design of memory usage, and sometimes it is out and out bugs leaking memory. On our class of a system, leaks are of really serious concern: we don’t want to be paging to our limited size flash.
In fact, much of the performance unpredictability of today’s free desktop can be attributed to the fact that several of our major applications are wasting/leaking memory and driving even systems with half a gigabyte of memory or more to paging quite quickly…
X [the X Window System] does what it’s told: many applications seem to think that storing pixmaps in the X server (and often forgetting about them entirely) is a good strategy, whereas retransmitting or repainting the pixmap may be both faster and use less memory. Once in a while there is a memory leak in X (generally in the graphics drivers): but almost always the problem is leaks in applications, which often forget the pixmaps they were using. RAM in the X server is just as much your program’s RAM, though it is in a different address space. People forget that the X Window System was developed on systems with 2 meg of RAM, and works today on 16 megabyte iPAQ handhelds.
We need better tools; some are beginning to appear. OLPC is sponsoring a Google Summer of Code student, Eduardo Silva, from Chile, who is working on a new tool called Memphis to help with this problem.
Work done on memory consumption will benefit everyone: not everyone in the world has a 2ghz laptop with a gig or two of RAM…
Confession: I’m not a coder. I help with the development of Linux only by documenting the parts I know, and by reporting bugs. While I join Jim Gettys in calling for more efficient code, even much of the bloated code still represents an enormous amount of good work – it just needs some cleaning up to become awesome code.
A question at BarCampAfrica: What use is a wiki, for the poor who have no internet?
1. First you need to develop the information resource. But over time I’m sure the Appropedia community will put more and more effort into dissemination.
2. There are all kinds of ways of distributing offline content – on a computer (e.g. OLPC bundles), CD-ROMs, flash drives, hard drives, printouts (leaflets, booklets or books*), and education programs based on content developed on the wiki.
3. Phones. A story was told at BarCampAfrica of a conversation in Africa. “Have you heard of Google?” “Yes, of course.” “Have you searched Google from a mobile phone?” “Of course – how else can you search with Google?” You only need one phone in the village with this capability to massively increase people’s ability to find information.
4. Villagers who have moved to the city to work, who maintain a connection to the village – if they have internet access, they can send or take information back to the village.
5. That other way – the one none of us has thought of yet.
There’s no need to put weighting on the different channels. You might think #4 won’t be effective, for example. You may be right. For now, the important part is #1: Create the resource.
* This is one reason it’s so important to use an open license that allows commercial use, so people can be motivated to distribute this knowledge.
BarCampAfrica – The OLPC (laptop) project is another form of harmful subsidy, says one critic. It was a gentle critique – even the critic is a fan of the OLPC project in many ways (as am I – extremely cool tech and great educational ideas).
But it’s clear to anyone familiar with development issues that subsidies really are harmful, much of the time – and the speaker had examples of his own. Like the big headaches for ISPs in Africa when international aid organizations come in and drop free connections on schools or communities. Such subsidies take out a whole chunk of the market that businesses no longer have access to – then when the aid organization moves on somewhere else, the community is left with local businesses that are weakened and less able to serve it.
Now, I still see the OLPC as doing much more good than harm. Sure, they’re taking out a huge chunk of the market… but that market mostly didn’t exist before OLPC’s innovations made it possible to serve these people.
So, I like the suggestions: Open source the design*, let anyone build them, and keep the margin local.
On the other hand, I wonder if there is any possibility of a market-based solution that achieves OLPC’s aims, especially saturation. But if a no-subsidy model leads to more effective markets and institutions, then that may be a more important achievement. It also leaves space for more innovation – e.g. variations on the Educational Television Computer (a.k.a. the $10 computer).
* Actually, isn’t it already open source…? Help me out here...
As with all posts in this blog, the views expressed here are those of the poster, and don’t necessarily represent the Appropedia community.