Categories: Hardware, Music, Technology

M-Audio Xponent Woe – an update

You may have read my generally glowing review of the M-Audio Xponent. I’ve been meaning to update you all with some important news on that front. After I wrote the review, my Xponent developed a fault. I’d had it less than a month, and it had had only fairly light use, maybe 8-10 hours total.

One of the audio channels started cutting out (on both the Main and Booth outs). It started as just a bit of distortion, then the channel became really quiet and badly distorted, then it cut out completely. The headphone channel wasn’t affected.

I surmised that this was a bad electrical connection. To test this theory, I did what any self-respecting techie would do: gave it a whack (well, a gentle tap on the side, followed by lifting the right-hand side of the unit about an inch and letting it drop). That did the trick: the sound cut back in, diagnosis confirmed.

Obviously I was mortified that the unit should have such a trivial manufacturing fault (and it wasn’t the only one, either… I’d already started to uncover some subtler, more minor problems, like one of the pots being centred at controller value 66 rather than 64). So it was sent back to DV. A month later, the repair was still not done and there was no ETA, so after some argument and quoting of the Sale of Goods Act, DV graciously agreed to a refund.

All of that prompted a reassessment of what I was aiming for with DJing, and whether computer-based mixing would really work for me even if I got a fully working Xponent. Was I happy to be staring at a monitor to mix? No, I do that all day for my day job. Could I imagine taking a laptop and console out to a club every time I play out? No, I’d just worry about them getting nicked or broken, and about the hassle of setting it all up. Was I content with the quality? Sort of, but in my heart I knew it wasn’t ever going to be as good as a pro-quality mixer. If I’m serious about DJing, I might as well do it on the equipment that is already there in every club the length and breadth of the land.

So I decided to invest in Pioneer CDJ1000s and a new mixer after all. That has definitely turned out to be the right choice for me. I’m having a lot more fun now than I was doing it on the computer, and getting professional-quality results that I don’t think I’d have got from the Xponent. All at much higher cost, of course, so it comes down to considering it an investment rather than an expense. I’m glad I tried out the computer-mixing option first, and did it with a console that (manufacturing defects aside) can seriously claim to be the best, or one of the best, out there. That left me in no doubt that I needed to pursue a different approach, rather than just a different console.

Categories: GUI/X11/Xfree86/Xorg, Hardware, Linux, OS/Software, Technology

M-Audio Xponent + Linux + mixxx

Updated: 2007/07/27 – New section on LEDs. Update on mixxx SVN.

In case any of you who have read my review of the M-Audio Xponent are thinking of getting one and wondering “but will it work under Linux?”, the short answer is a resounding Yes!… except for the LEDs, so far. More on that later.

I use Debian unstable with a hand-rolled kernel. YMMV, of course, but the chances are that if you’re using any modern distro you’ll be fine. In fact, if you have a working udev setup you may not even have to insert the kernel modules manually; you might just be able to plug’n’play.
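If you want to check that the unit has actually been detected before firing up mixxx, a few quick commands will tell you. This is a minimal sketch assuming (as I understand it) that the class-compliant snd-usb-audio driver is what does the work; the grep pattern is just a guess at the device string lsusb will print:

$ lsusb | grep -i m-audio        # is the unit visible on the USB bus?
$ cat /proc/asound/cards         # has ALSA registered the audio interface?
$ amidi -l                       # list the raw MIDI ports (needs alsa-utils)
$ sudo modprobe snd-usb-audio    # only if udev hasn't loaded the driver for you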

Full details below the cut, as they say…

Categories: Hardware, Music, Product Reviews, Technology

M-Audio Xponent review


Update: After you’ve read this article, and before you rush out and buy one of these, you should read this update (don’t worry, I’ll link to it at the bottom of the page too).

It’s rare for me to suffer from “gear lust”, but the M-Audio Xponent set my pulse racing when I discovered it on the web. I’ve now had it for a few days, so here’s my review. I won’t be reviewing the Torq software: I haven’t used it, as I only have Linux at the moment. There’s a separate article about my experience getting the Xponent working with Linux and mixxx.

Review follows:

Categories: PHP, Programming Languages, Technology

import_request_variables(): When will PHP stop being insecure by design?

Re the Bugtraq post “PHP import_request_variables() arbitrary variable overwrite”.

This sort of thing really brings it home how the PHP core team still don’t seem to really understand security… or would rather sacrifice it in the name of backwards, very backwards, compatibility.

If you’re going to provide a function like import_request_variables() to replace the blatantly unsafe register_globals, how on earth can you get it so badly wrong that it’s even more unsafe?? I mean, security 101 for a function like this would be:

  1. Don’t overwrite any existing global variables (come on!); there’s a sketch of the resulting attack just below this list.
  2. Failing to specify a prefix should raise at least an E_WARNING.
  3. Stop pandering to people who wrote, or still use, bad code originating back in the PHP 3 days, and aim towards getting rid of the whole concept of registering global variable symbols from user-supplied data. It was always a bad idea, it’s never going to stop being a bad idea; just drop it.
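To make point 1 concrete: as I understand the advisory, if a script calls import_request_variables() with no prefix and then trusts a variable it assumes PHP set, an attacker can simply supply that variable in the request. A hypothetical example (host and script name invented):

$ # overwrite $_SERVER['REMOTE_ADDR'] before an IP-based access check runs
$ curl -g 'http://victim.example/admin.php?_SERVER[REMOTE_ADDR]=127.0.0.1'

(The -g stops curl treating the square brackets as a URL glob.)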

I also get terribly bored of seeing Bugtraq reports of a “full path disclosure” bug in some app (as if that were a big deal anyway), only to find that it is, once again, PHP itself that’s at fault. Sure, the sysadmin on a production machine should set display_errors = Off, but if it’s left on, why does PHP show the full path?

If the purpose is to help the programmer developing the code (who somehow doesn’t have access to the server error log), then the programmer doesn’t need the full path: they already know where the docroot is. So when showing errors from scripts in the docroot, why not show only the path relative to the docroot? And if the error comes from an included or required file outside the docroot, the last directory component ought to suffice. The full path can still go in the error log, and then maybe developers would learn to use that instead of relying on errors going to screen…
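For reference, the belt-and-braces settings for a production php.ini look like this (the directive names are standard; the log path is just an example):

display_errors = Off
log_errors = On
error_log = /var/log/php_errors.log

That keeps the full path available to the developer in the log, without handing it to every visitor.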

PHP can be secure, but it really needs to stop offering features that are insecure by definition.

Categories: GUI/X11/Xfree86/Xorg, OS/Software

Zen console messages

It’s a shame that console messages from GUI apps usually go unread. I just discovered the following profound koan in an xterm from which I’d run Firefox:


Warning: more than one line!
Warning: more than one line!
Warning: more than one line!
This should only happen once
Warning: Attempt to remove nonexistent passive grab

Categories: Linux, OS/Software

BackupPC woe

I’ve been using BackupPC to take offsite backups of all my machines over the network for over a year. It seemed to work well enough and, I thought, would always email me if it hadn’t been able to back up a particular machine for a few days.

Yesterday I discovered that it had not done a successful backup of one of my machines since March! I suddenly noticed on the status screen that instead of a table of 8 backups (2 full and 6 incr), only 3 were shown: 2 full, both dating back to March, and 1 “partial” from the day before yesterday. Looking at the logs, I see this:


2006-07-24 06:00:05 full backup started for directory /data/work; updating partial 678
2006-07-24 06:20:28 full backup started for directory /home; updating partial 678
2006-07-24 06:20:34 Got fatal error during xfer (fileListReceive failed)
2006-07-24 06:20:39 Backup aborted (fileListReceive failed)
2006-07-24 06:20:39 Saved partial dump 678

Exactly the same thing has been happening every day for the past 4 months. BackupPC didn’t email to tell me. Its email system was definitely working, because during that time it did mail me about a machine that was offline for a while. So it appears it doesn’t bother to send mail to notify you of a failed backup!

I had no idea what might be causing this. It just started out of the blue, having worked flawlessly before March. It only affected one machine. The configuration had not changed. It always failed on /home but was apparently ok with /data/work.

Something weird in /home? To find out, I set tar loose on it:

$ cd /home; tar cf - . >/dev/null
tar: ./jammin/.gxine/socket: socket ignored
tar: ./jammin/.kde/kdeinit-\:0: socket ignored
tar: ./sarah/.totem.sarah: socket ignored
tar: ./sarah/.xine/session.0: socket ignored
$

Surely not. Surely it couldn’t be something as trivial as a couple of stale socket files causing my backup to fail? Well, I’m not using any of those programs, so I deleted the sockets and told BackupPC to start a full backup. What do you know — it worked.

So is it that it doesn’t like sockets? Or has the poor thing got confused by the funny characters in the filename of that KDE one? I’ll test this out at a later date when my backups have recovered.
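For the record, you don’t have to sift tar’s output to hunt these down: find can list socket files directly, and delete them once you’ve eyeballed the list.

$ find /home -type s                     # list all socket files under /home
$ find /home -type s -exec rm -v {} \;   # then delete them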

There are three major failings by BackupPC here. First, failing over a simple socket or dodgy filename, and not giving much clue why. Second, not bothering to email when a backup fails halfway through. But third, and most concerning of all, it kept trying to add to the same partial backup instead of starting a new one, so I no longer have 2 weeks’ worth of incrementals even for the part of the backup that succeeded. Every day, yesterday’s backup was being overwritten by today’s. If I needed to recover a version of a file in /data/work from 2 days ago, I couldn’t. That sucks.

This has made me realise something. Quis custodiet ipsos custodes? (Who watches the watchmen?) Why am I relying on *one* backup solution? It’s a SPOF, and it has quite spectacularly F’d. I still want to find out why, and ideally fix it, but I’m also going to start setting up something else alongside. Since BackupPC is server-driven, the alternative should be client-driven. All recommendations welcome. The two major requirements are that it must support ssh and be bandwidth-efficient, because I’m backing up over ADSL. All the client machines run Linux.
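The obvious client-driven candidate is plain rsync over ssh with hard-linked daily snapshots, which only transfers deltas and so should be kind to the ADSL uplink. A minimal sketch of the idea (hostname and paths invented; not a tested setup):

#!/bin/sh
# Client-driven backup sketch: rsync over ssh, with hard-linked snapshots
# so each day only costs the changed files in transfer and disk space.
HOST=backup@offsite.example.com
TOP=/backups/$(hostname)
TODAY=$(date +%Y-%m-%d)

# -a preserves permissions and times, -z compresses over the wire,
# -R keeps the full source paths under the destination directory;
# --link-dest hard-links unchanged files against the previous snapshot.
rsync -azR --delete --link-dest=../latest \
      /home /data/work "$HOST:$TOP/$TODAY/"

# Re-point "latest" at the snapshot we just made.
ssh "$HOST" "ln -sfn $TODAY $TOP/latest"

One nice property of the --link-dest approach is that every snapshot is just a directory: any of them can be deleted with rm -rf without breaking the others.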