Category: Computing

Represents a technical document related to computing. This includes technical analyses and accounts of particularly (and sometimes needlessly) troublesome installations or configurations.

  • Cryptocurrency: a headache worth the trouble, or not?

    In late March 2022, I started looking into cryptocurrency, with the goal of diversifying my portfolio and having something ready in the (I hope unlikely) event that the Canadian dollar crashes catastrophically in a few years, given all these crises that keep piling up with no end in sight. I had no idea what a can of worms I was about to open by exploring this, and I'm still not sure it's all worth the trouble, but well, it's done.

    Cryptocurrency sits somewhat outside the classic economy. You can't withdraw it from an ATM, or use it to pay at the grocery store or the pharmacy. You have to obtain it through an exchange, a site that converts between regular dollars and crypto. You can then transfer that cryptocurrency to a personal wallet whose private key is under your own control. Cryptocurrency can be transferred between holders, or back to an exchange to be converted into regular money again.

    If you want to know why obtaining cryptocurrency was such a headache for me, read on.

    What is cryptocurrency?

    Before diving in, I did some research. I got my first pieces of information from Getting started with Bitcoin. That site explains what Bitcoin is and the various steps to store and obtain some.

    Money is based on scarcity. Originally, gold was used because it was hard to obtain. Eventually we stopped hauling gold around and used paper to represent a certain quantity of the metal. Then, gradually, the link between the paper and the metal faded. Some even think certain countries now use oil as the scarce material instead of gold. With cryptocurrency, however, the "material" is virtual. Instead of gold, from what I understand, it would be parameters of a system of equations: values that are expensive to compute. Using cryptography, these values can be tied to a specific account held by a user. These links, which of course change as money moves from one account to another, are stored in a blockchain replicated across many nodes scattered all over the web. Every cryptocurrency transaction creates a block that is linked to its predecessor in the chain, then propagated. The full history of transactions is thus preserved. All transactions are public, yet anonymous, since the link between account addresses and their users isn't always known. The growing number of nodes increases the redundancy and reliability of the blockchain. Since any change to an existing block must be approved by a majority of nodes, falsifying the transaction history becomes nearly impossible, even though no central authority regulates any of it.
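    To make the chaining idea concrete, here is a toy sketch in Python (the transactions are made up, and real blockchains add proof-of-work, signatures and network consensus on top of this): each block stores the hash of its predecessor, so rewriting an old block invalidates every block after it.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def is_valid(chain):
    """Walk the chain and check every prev_hash link."""
    prev = "0" * 64  # conventional hash for the very first block
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

# Build a tiny chain: each block records a transaction and links back.
chain = []
prev = "0" * 64
for tx in ["Alice pays Bob 1", "Bob pays Carol 2"]:
    block = {"tx": tx, "prev_hash": prev}
    prev = block_hash(block)
    chain.append(block)

assert is_valid(chain)
chain[0]["tx"] = "Alice pays Mallory 999"  # falsify history...
assert not is_valid(chain)                 # ...and the links break
```

    The last two lines show why tampering is detectable: changing any past block changes its hash, which no longer matches the `prev_hash` stored by its successor.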

    To transact in cryptocurrency, you need a key pair. The public key lets you receive money from another account, and the private key is required to send cryptocurrency. The public key isn't sensitive at all; you can share it freely, since all an attacker can do with it is send money to its owner. Deriving the private key from the public key would likely take as long as the lifetime of our sun, or even of the universe, unless perhaps you have a quantum computer, which nobody does yet. The private key, on the other hand, must be protected scrupulously, because it gives access to the funds in the associated accounts.
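    That asymmetry (cheap one way, infeasible the other) can be illustrated with a toy number-theoretic sketch. Real wallets use elliptic-curve keys around 256 bits; the numbers below are far too small to be secure and are purely for illustration.

```python
import secrets

# Toy parameters: a known Mersenne prime and a small base.
p = 2305843009213693951  # 2**61 - 1
g = 3

private_key = secrets.randbelow(p - 2) + 1   # must stay secret
public_key = pow(g, private_key, p)          # this direction is cheap

# Recovering private_key from public_key is the discrete-logarithm
# problem: the generic approach is trial and error, hopeless at real
# key sizes.
def crack(target, max_tries):
    for k in range(1, max_tries):
        if pow(g, k, p) == target:
            return k
    return None  # gave up
```

    Even at this toy size, `crack` only succeeds when it is allowed enough tries to reach the exponent; doubling the key length squares the work, which is the whole point.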

    Cryptocurrency wallets

    The first step into the world of cryptocurrency is obtaining a private key representing an account. Unlike regular money, this account doesn't need to be tied to any financial authority. It's simply a sequence of numbers representing a private key. That key obviously has to be protected; otherwise, all the cryptocurrency it refers to can be lost.

    To obtain cryptocurrency, you'll usually need at least two wallets: one hosted online, provided by an exchange, and one hardware (physical) wallet, protected by its owner. This keeps you independent of the exchange, which, in case of force majeure, could block access to the cryptocurrency.

    Following the advice of a former colleague and friend, I went for a Ledger Nano S wallet. The device looks like a USB key, but it isn't one, not in the usual sense.

    Ledger Nano S

    Although the device seems fairly sturdy, it's worth protecting. I bought a case just for that.

    Cylindrical steel case that can hold the Ledger Nano S, for maximum protection against the elements and mishaps

    I received the device on Monday, April 4, 2022. Shaped like a USB key, it came with a cable to plug it into a USB Type-A port. I plugged it in and got a display asking me to set up the wallet. Setup is done through the Ledger Live application, but really, that application only provides step-by-step instructions; the configuration itself happens in hardware, on the little USB-key-like device.

    The user interface is quite minimal: two buttons, one to navigate left, the other to go right, and both buttons pressed simultaneously to confirm the current operation.

    Since the Ledger Live software is available for Linux, I installed and launched it under Ubuntu. It's distributed as an AppImage, a binary file containing everything the software needs, which avoids the myriad of problems caused by the diversity of available Linux distributions. Once started, the software asked whether I wanted to set up a new wallet, restore one, or whether I already had a configured one. I chose the first option, and was told the operation would take about half an hour. Then I searched, searched, searched and searched again: no way to find the button to click to start the setup. It turned out I had to shrink my font size, system-wide, to get access to the button!

    The first setup step was configuring a PIN. That's done on the device itself. You pick each digit of the PIN with the buttons, then press both buttons simultaneously to move on to the next digit. The PIN can have between 4 and 8 digits. Once done, the device asks you to confirm the PIN by entering it again, with the buttons.

    An interesting detail: the initial digit is random, so you can't enter the PIN without looking at the display, which is a bit small, at least for me. If the starting digit had always been 0, I could have pressed the right button 3 times to get a 3, and so on. On the other hand, this interface decision makes it impossible for an observer to guess the PIN by counting button presses, which is a good thing.

    If I thought the PIN was painful, I hadn't seen anything yet! Oh boy! The second step was setting up the private key that would be associated with the wallet. This key is generated and stored on the device. It is never transferred to the computer's hard drive or to a mobile device, so it cannot be obtained by malware. But the key has to be backed up somewhere, in case the hardware wallet is lost or damaged. You can write it on a piece of paper, or purists buy a crypto steel so it can be engraved in steel! Yes, really, it's that crazy!

    The key is represented by 24 words in a precise order. I really thought I'd have to painstakingly copy 24 meaningless alphanumeric strings. Luckily, that wasn't the case, because the characters on the somewhat-too-small display would have made that task difficult for me, with a high risk of transcription errors. Instead of alphanumeric strings, common English words are used. You might think that's insecure, but you can easily imagine there are at least 256 words in the list. 256 is represented using 8 bits, and 8 × 24 gives 192 bits: billions upon billions of possible combinations. And there may well be far more than 256 words, hence far more than 8 bits per word. By comparison, a classic lock has at most about five pins, each with maybe ten possible positions.
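    That back-of-the-envelope estimate turns out to be conservative: the wordlist is standardized by BIP-39 and contains 2048 words, so each word carries 11 bits. A quick sanity check in Python:

```python
# BIP-39 standard: 2048-word list, 24-word recovery phrases.
WORDLIST_SIZE = 2048
WORDS = 24

bits_per_word = WORDLIST_SIZE.bit_length() - 1   # log2(2048) = 11
total_bits = bits_per_word * WORDS               # 264 bits in total
combinations = WORDLIST_SIZE ** WORDS            # 2**264 possibilities

print(f"{total_bits} bits per phrase")
```

    264 bits is about 3 × 10^79 combinations, far beyond brute force (a few of those bits serve as a checksum in BIP-39, but the order of magnitude stands).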

    So to complete this crucial step, I had to display the words one by one, try to read them (sometimes with difficulty!) and copy them onto the piece of paper. Sometimes I struggled to read a letter, but what saved me is that only one letter could fit that spot to form an existing word. Then I was asked to confirm the 24 words, one after the other. Once done, I stored the piece of paper in a safe place.

    With the wallet configured, after a sigh of relief (phew, transcribing that key took forever), I had to tell Ledger Live to connect to it and verify its authenticity. Well, surprise: instead of completing the operation, the software threw an error message saying it couldn't access the wallet. All I could do was retry and, if the problem persisted, contact technical support. Damn! I was fairly sure they would exchange the device, forcing me to redo the whole private key setup.

    Before going that route, I rebooted my machine into Windows and retried with the Windows version of Ledger Live. This time, I indicated that I had an already-configured device, and it moved on to the authentication step. If I remember correctly, that failed again and I had to unplug and replug the device, then re-enter my PIN. With that done, the verification succeeded. But I wasn't at the end of the road yet.

    To access cryptocurrency, you have to install applications on the device. That's done through Ledger Live, which has to activate Ledger Manager. You must confirm the activation of Ledger Manager on the device by pressing both buttons simultaneously. But sometimes this produces an error, and you then have to retry, which gets you nowhere, or unplug and replug the device. The first time it worked, I was asked to update the firmware, which I did; the device then rebooted, asked me for the PIN again, and I could finally install a first application. I installed Bitcoin, then Ethereum.

    With the applications installed, you can then create one or more accounts for the cryptocurrencies. What would happen if I wanted to deal with more than three types of cryptocurrency? I don't know yet; I might have to juggle uninstalling/reinstalling applications, or I'd need several hardware wallets. Each with a different private key? Ouch!

    To access my Bitcoin account, the Bitcoin application has to be started, but sometimes that fails and you have to unplug and replug the device! What? Yep! When Ledger Live is talking to the Bitcoin application, you can see the balance, and send and receive Bitcoin. When you ask to receive Bitcoin, that's where you find the public key of the Bitcoin account. I did the same for Ethereum.

    After all that, which took more than an hour of work and, combined with my workday, thoroughly exhausted me, I had a place to store cryptocurrency. To be precise, the wallet doesn't store the money. It lives in the blockchain, on the Internet. The wallet only holds the key giving access to the cryptocurrency I own. Ledger Live never copies the wallet's key into memory; instead, it asks the wallet to perform, by itself, any cryptographic operation involving the private key. That's what makes it so different from a simple external hard drive or USB key, on which you could store as many private keys as you want. Even malware running on the computer couldn't force the wallet to hand over the private key, only make it perform cryptographic operations on data.

    Buying cryptocurrency

    To turn regular dollars into cryptocurrency, you have to go through an exchange. My former colleague and friend recommended Shakepay, but there are several others. In a perfect world, you could use your bank's or credit union's website to move funds into cryptocurrency, but banks are afraid of it, since they can't regulate it as much as regular money. So you have to create yet another account, yet another password and all. Fortunately, the operation turned out to be quite simple.

    I created the account on Tuesday, April 5, 2022. Shakepay needs to perform a basic identity check. It's done by installing the application on your phone; I don't know exactly how, but at least there's no need to send a copy of an ID somewhere. That's already something.

    Before you can buy cryptocurrency with Shakepay, you first have to transfer regular dollars into it. This can be done in various ways. I chose an Interac transfer. Shakepay gave me a destination e-mail address, a security question and an answer. I could then use Interac e-Transfer on AccèsD to move funds to my new Shakepay account.

    With that done, I could buy my first Bitcoin and Ethereum. But that's not the end of it. The private key giving access to that cryptocurrency is under Shakepay's control. If the site ever stops operating, that money is gone, possibly for good. Some people have lost their entire cryptocurrency investment; there are several ways this can happen, and you have to think about them to try to protect yourself.

    Shakepay to Ledger: more headaches!

    Last step: transferring the cryptocurrency from my Shakepay account to my Ledger wallet. OK, this will be easy: click somewhere to send Bitcoin, copy/paste my public address obtained through Ledger Live, and done? Nope! Instead of asking for the address and the amount of Bitcoin to transfer, the website told me it couldn't perform the operation; I had to use the mobile application. Fine, let's install it; luckily it's available for Android, not just iPhone, otherwise I would have fumed quite a bit. With the application installed, well, I searched and searched and searched: no way to find the option to send Bitcoin! All I could do was send funds to another Shakepay account.

    In the end I had to search the Internet and sit through a video to find out how. You had to tap the Bitcoin option to see your Bitcoin balance, and there, there was a Send button to send funds. The button, a bit small, was light gray on a white background: not great at all. Then I pasted my Bitcoin address and it didn't work. I had to try several times before the application recognized it as a Bitcoin address.

    Then came the critical step: verify with Ledger Live that the money had indeed been transferred, or it was lost in limbo! Fortunately, the transfer worked.

    Now for the Ethereum. Oh no, we can't: there's a minimum transfer of 0.1 ETH, which was roughly $400 in April 2022. I started small to test; I'll have to transfer more money to go further and get Ethereum into my Ledger account.

    Is it possible to lose all your cryptocurrency?

    The answer, unfortunately, is yes! It has happened to some people. I'm still researching how it can happen and how to protect myself. So far, I've found the following possibilities.

    • If the private key falls into someone else's hands, that person could use it to grab my money. If I understood correctly, they wouldn't even need a Ledger wallet. The attacker could install software on their computer, enter the key into it, and presto. So why didn't I use such software myself? Because software is vulnerable to malware; the hardware wallet isn't. If my hardware wallet is stolen, I can still be safe, because the thief would have to guess my PIN to access it. However, if the thief gets hold of the 24 words on the piece of paper, I risk losing everything. If my private key were ever compromised, I would have to generate another key as quickly as possible and transfer the (remaining) money from the old account to a new one. So it's crucial to protect those 24 words well. It's the single most important thing. A compromised private key is by far the easiest way to lose your entire cryptocurrency investment.
    • The cryptocurrency could be devalued if enough of it ends up in circulation. It relies on the fact that the values solving certain systems of equations are hard to obtain. If an algorithm is discovered that makes this easier, it could reduce the currency's value. You can probably protect yourself a bit by investing in several types of cryptocurrency, since each type uses somewhat different equations.
    • If the exchange (Shakepay) is compromised by a cyberattack, the attacker could in theory recover private keys. If funds remain in my Shakepay account, they could be taken by the attacker. If Shakepay suddenly shut down, customers could also lose their private keys. A government decree justified by emergency measures could well force the suspension of all Shakepay accounts, making the funds unavailable.
    • If something happens to my Ledger wallet (loss, theft, damage), I'll have to get a new one and restore my 24-word key onto it. If I get the key wrong, the wallet may initialize just fine but point to... nothing at all! You need the right key, exactly the right 24 words in the right order, to point back to my cryptocurrency. If I found myself in that situation, before crying, I would have to redo the restoration and carefully check that I'm not making a mistake with the key. I'd probably be stressed to death and, if it worked, as relieved as after a parachute jump or something like that. Phew!
    • A stupid handling mistake during a transfer can cost a lot. If I want to convert my Bitcoin back into dollars, I'll first have to transfer them from my Ledger account to my Shakepay account, then sell the Bitcoin, and finally transfer the resulting dollars to my regular account. The transfer from Ledger to Shakepay involves a simple public address to copy/paste, with no additional verification. I'm not sure a nonexistent address would be detected; the money would then be lost! If, while copying and pasting, I drop a character from the address, that might be enough to lose everything. So I'll have to verify that the address was copied correctly and/or split the transfer into several smaller ones.
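    On that last point, there is some built-in protection: classic Bitcoin addresses use Base58Check encoding, which embeds a 4-byte checksum, so a mistyped or truncated address is almost always rejected by wallet software rather than paid out. Here is a minimal sketch of the mechanism (the encoder and the sample payload below are illustrative, not taken from any real wallet):

```python
import hashlib

# Base58 alphabet used by Bitcoin (no 0, O, I or l, to avoid confusion).
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58check_encode(payload):
    """Append a 4-byte double-SHA-256 checksum and encode in Base58."""
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    raw = payload + checksum
    n = int.from_bytes(raw, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = ALPHABET[r] + out
    # Each leading zero byte is encoded as a leading '1'.
    return "1" * (len(raw) - len(raw.lstrip(b"\0"))) + out

def base58check_ok(address):
    """Return True only if the embedded checksum matches the payload."""
    n = 0
    for ch in address:
        if ch not in ALPHABET:
            return False
        n = n * 58 + ALPHABET.index(ch)
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    raw = b"\0" * (len(address) - len(address.lstrip("1"))) + body
    if len(raw) < 5:
        return False
    payload, checksum = raw[:-4], raw[-4:]
    return hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4] == checksum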

    What is all this going to be used for?

    I'm going to grow this cryptocurrency capital over time, not too quickly out of caution, but I'll add to it regularly. It will be a safety net if the dollar ever crashes; at least that will remain. It could also be the cryptocurrency that crashes, as many financial experts have been predicting for years and years. Nature found the solution to survival for us: diversity. Keep some dollars; don't throw everything into cryptocurrency.

    I'm thinking about, and will research, strategies that could make the cryptocurrency grow. Maybe by regularly trading between different types of cryptocurrency, depending on their current value, one could end up making a profit. There may also be ways to make cryptocurrency investments.

  • Will it remain possible to upgrade laptops with SSDs?

    Last year, I upgraded my sister's ThinkPad G500 with an SSD, which greatly increased its performance. The laptop also suffered from hardware issues caused by its faulty DVD drive; surprisingly, removing the drive cured them.

    After this success, I was thinking about improving her boyfriend's laptop, an HP machine that happened to be slower than the older ThinkPad. The presence of a McAfee virus scanner, possibly running on top of Windows Defender, wasn't helping much, but the 5400 RPM hard drive was definitely making startup slow.

    Figuring out if we can install the SSD

    Since Apple has made it harder and harder to upgrade their laptops, to the point that it is now nearly impossible with their most recent MacBooks, other laptop makers may well follow, so it is important to verify if and how the hard drive can be upgraded before purchasing a drive!

    The best way to figure this out is to search Google for a hard drive replacement guide for the laptop model. The model was an HP 15-bw028ca. This time, however, all I could find was the laptop's specifications and some YouTube videos showing how to disassemble similar laptops, but not that one! It took me more than an hour of searching to find the maintenance manual for that laptop, and then I was able to get the information I needed.

    The hard drive sat behind a removable bottom cover. However, according to the manual, only the battery and optical drive should be removed by the user; everything under the bottom cover should be serviced only by an HP-authorized technician. Quite bad! But since that laptop was no longer under warranty, it was less of an issue. It did make it more important to carefully evaluate whether I could reliably remove that cover and put it back without breaking it. Without the cover, the laptop would at best look ugly, at worst no longer hold together and thus not work!

    Besides assessing the risk of disassembling the laptop to reach the hard drive without making it ugly or non-working, I needed to figure out which type of drive to install. The machine supports 2.5″ SATA drives, but it also accepts M.2 ones. HP used SATA hard drives but M.2 SSDs. However, M.2 requires replacing the connector, which is specific to HP. Getting the M.2 connector was likely to be problematic, so I decided to try a SATA SSD, since both hard drives and SSDs exist in SATA form. I found a 500 GB SSD on Amazon.ca and ordered it.

    Replacing the hard drive

    When I got the laptop and the SSD, I first examined the laptop a bit and confirmed it looked like the one covered by the manual I had found. There was a small difference, though: no optical drive, so it wasn't exactly the same model as in the manual.

    First I put the laptop upside down and removed the battery using the latches. That operation was easy, as expected; unlike MacBooks and many ultrabooks, this machine's battery isn't sealed in.

    The HP laptop upside down
    Latches holding the battery

    After that, I had to find and remove all the screws holding the cover in place. There were screws pretty much everywhere, even under the four rubber pads and under the battery. Trying to pry the cover off, starting from the back near the battery slot, without removing all the screws risked breaking the cover or the chassis, making reassembly impossible. I was thus quite worried, and increasingly concerned that the cover couldn't be removed without a special tool.

    Screws can be anywhere, including center and sides

    Some attempts at removing that cover caused concerning cracking sounds. I was seriously worried about breaking the laptop altogether. But at some point, the cover unclipped completely, revealing the inside of the machine.

    Removed cover
    Inside the machine

    The hard drive is in the upper-left corner of the above picture. It is held in place by a bracket screwed to the chassis, similar to the ThinkPad. I removed the screw and was able to disconnect the drive. Then I transferred the bracket onto the SSD and put the SSD in there. Nice, it seemed to work. I was so sure it worked that I put the bottom cover and the screws back on.

    Unfortunately, when I turned on the laptop, I found out it didn't detect the SSD at all. I first started Ubuntu from a USB key: that worked, but it couldn't find the SSD. I then booted without the USB key and was told to install an OS or press F2 for diagnostics. I pressed F2, tried to start a hard drive test, and got a message saying there was no hard drive installed. Could it be that only HP-approved drives can be installed?

    I thus had to remove the cover a second time. Before putting the old hard drive back and calling it a day (my sister's boyfriend would have had to contact HP, which would check his warranty and then advise him to buy a new laptop), I removed the SSD and checked the connection. The first time, it had been a bit too easy to connect the drive: it wasn't aligned with the SATA connector! On the second attempt, I felt a slight resistance showing that the drive engaged into the connector. I screwed it back in and tested, with the cover on but not yet the screws. After I checked that it worked and passed the self-test from the HP diagnostics, I put the screws back; the drive was installed and detected!

    If I had just put the old drive back without investigating the connection further, maybe it would have failed again, and I would have been stuck: unable to get the SSD working, but also unable to put the laptop back into its original state. Every hardware modification carries this kind of risk, which must be carefully weighed before attempting it. This is why I decided not to try my luck on my Lenovo IdeaPad Yoga 13, whose SSD is too small; that one is trickier to access.

    Another activation concern

    Then came the time to install Windows 10. My plan was to use my USB-based installation medium as I did on other laptops. However, would activation work? It was possible that it would not, asking me for a product key instead. I could not get the product key unless I put the old hard drive back and logged in to the old Windows installation, which would either require the password of my sister's boyfriend or a way to hack the installation to turn on the local administrator account. Even with that product key, activation could fail anyway, requiring me to call Microsoft and try by phone. This could have gone as far as requiring HP's custom recovery partition, or the purchase of a new product key.

    Fortunately, a plain installation worked like a charm. The installer didn't ask me for any product key, and afterwards Windows was activated!

    Some testing and post-installation steps

    After that successful installation, I installed the drivers and tested the machine a bit. It was working and not crashing. I added Firefox and LibreOffice, and I created the user account for my sister's boyfriend, making sure it was set to Administrator rather than a Standard account.

    Restoring Enigmail

    My sister's boyfriend uses Thunderbird and Enigmail to send encrypted messages to some of his friends. Without a full restoration of his configuration, he would have to generate a new private key, notify all his friends about that new key, and get all their public keys back. That is annoying and inefficient, both for him and his friends. I thus wanted to restore this configuration, but we didn't know where it was stored.

    I had to install Enigmail on my machine to test, and figured out that it uses GnuPG's keyring. That keyring is located inside the Application Data folder, under the gnupg directory. I wasn't sure he had found and backed up that directory, so I plugged his hard drive into a SATA-to-USB adapter and recovered the directory. He would thus be able to copy it to the right location after installing Thunderbird and Enigmail.
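    That backup step can be sketched in a few lines of Python (the paths are assumptions based on a default GnuPG install on Windows, where %APPDATA% points at the roaming Application Data folder; a non-default install may keep the keyring elsewhere):

```python
import os
import shutil

def backup_gnupg(appdata, backup_root):
    """Copy the GnuPG keyring directory (by default %APPDATA%\\gnupg on
    Windows) into backup_root. Returns the destination path, or None if
    no keyring directory was found."""
    src = os.path.join(appdata, "gnupg")
    dst = os.path.join(backup_root, "gnupg")
    if not os.path.isdir(src):
        return None
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst
```

    On the live machine you would call it as `backup_gnupg(os.environ["APPDATA"], "E:\\backup")` or similar; in my case the source was the old drive mounted through the SATA-to-USB adapter instead.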

    Why couldn't I have set all this back up completely? Because that would require logging in to his account, which would have required his password. Then, to set up his Gmail account, I would have needed his Gmail password. It is important that passwords remain secret, even if both of us knew I would not misuse them afterwards.

  • Bumpy Windows 10 activation

    For months, I had been thinking about a way to upgrade my parents' computer to Windows 10. The machine was running a copy of Windows 7 that had failed to activate, so it required a crack. I wanted both to upgrade to Windows 10 and to get it a fully activated, genuine copy. However, the machine was four years old, so it was hardly worth paying more than $150 for an OEM copy of Windows, which cannot be transferred to another PC. I thus put this off, since the computer was working correctly.

    However, during summer 2019, I was forced to figure something out, because in January 2020, Windows 7 would reach end of life (no more support or security updates), and the program my mother was using for her taxes, ImpôtRapide, was dropping support for Windows 7 as well. So we needed a solution. Without my help, my parents would have been stuck going to a computer store, which would likely tell them it's sorry, but installing Windows 10 isn't worth it and it would be better (for their cash flow...) to purchase a new computer. They have some at $400. Yeah, likely a uselessly slow laptop with a small hard drive. My sister got caught this way: her old laptop, on which I replaced the hard drive with an SSD, is faster than her new one!

    I thought about two possibilities to solve the Windows 10 issue:

    1. Pay $150 for the OEM copy from Microsoft. It would have worked, but if the computer broke a couple of months later, we would need a new OEM copy in addition to the new computer!
    2. Convince my parents to buy a new computer with Windows 10 preinstalled. This would have been far simpler for me but more costly for them. My mother told me she would someday like a laptop to transfer photos onto my grandmother's digital frame. But finding a good laptop takes time; it is easy to end up with something slow, bloated with crapware, and difficult to upgrade.

    But in June 2019, I found a third possibility: Kinguin, a website offering product keys for Windows, Office and several Steam games. There was a risk of getting a key that would not activate, or even worse, a completely non-working key, but if it worked, it would cost $40 per key. I took the plunge on Sunday, July 14, 2019, and purchased both keys. It would have been smarter to purchase just the Windows key and try it first, but I wanted to have both keys while I was at my parents' place.

    Preparation

    Before attempting the installation, I used Clonezilla to make a full backup of the SSD that would be formatted. The disk contained Windows 7 and a few programs. In case of a total, catastrophic failure, such as broken activation causing repeated errors, or an unexpected but shocking hardware incompatibility, it would be possible to restore Windows 7 in less than half an hour, as opposed to fully reinstalling it. Of course, that would have been temporary; a new attempt at installing Windows 10 would have been needed later.

    Even that operation caused issues, because for some reason, the USB stick Clonezilla is installed on got corrupted (this has happened at least three times since I started using Clonezilla, most likely because of flaky USB sticks, not the program itself) and some programs failed to load. The image checking program failed, reporting the images as broken. I had to reformat the stick, reinstall Clonezilla on it with Tuxboot, then try again. The second time, Clonezilla reported the images to be fully restorable.
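    Since flaky USB media can silently corrupt images like this, checksumming the backups right after writing them gives an early warning, independent of Clonezilla’s built-in check. A minimal sketch, with made-up file names, demonstrated on a throwaway file:

```shell
# Write a checksum file next to the backup images, then verify it before
# trusting the backup. Demonstrated on a stand-in file in /tmp.
mkdir -p /tmp/backup-demo && cd /tmp/backup-demo
echo "fake image data" > sda1.img           # stand-in for a real image file
sha256sum sda1.img > checksums.sha256       # record the checksum
sha256sum -c checksums.sha256               # exits non-zero if anything changed
```

Re-running `sha256sum -c` on the copied images after a restore attempt catches corruption introduced by the stick itself.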

    I expected the new installation to be fully UEFI compliant, as opposed to the Windows 7 setup, for which I had needed to fall back on BIOS/MBR because of the crack used for the activation. I wanted no trace of the MBR at all and wasn’t sure Windows would remove it, so I booted the machine from a USB key containing Ubuntu MATE and used GParted to create a new GUID Partition Table (GPT). That pretty much destroyed all the data on the SSD.

    Installation but no activation

    After the preparation, I successfully booted another USB key, this one containing the Windows 10 installation medium I had recreated a couple of days earlier; there is now a free tool from Microsoft for that. The medium contained the latest updates, so no everlasting installation of updates like with Windows 7. To make sure the USB key booted in UEFI mode, I pressed F8 at computer startup to get the boot menu and picked the UEFI entry of the USB key, and the boot indeed happened correctly, in UEFI mode. I was then able to proceed with the installation, and the product key from Kinguin worked without any issue. It had been delivered as a scan or photo of a sticker with the key printed on it. Using my laptop, I displayed the image, zoomed in until I could read the 25 characters, and typed them in.

    After the installation completed without issues, I was required to log in with a Microsoft account. My mother wanted to use the email address provided by her ISP, so I tried that. The system said she already had an account; I didn’t know she had a Microsoft account. Fortunately, she managed to remember the password and we could connect it to the new Windows 10 installation, so no need to create a new account with a Microsoft email address or attempt a password recovery.

    Then I tried to install the drivers. The Intel graphics driver from ASUS failed to install, but I then noticed that all devices were already working and decided not to insist. Graphics were OK, audio was working, networking as well, except maybe for a quite slow Internet connection. At the time of writing this post, I was starting to question myself: maybe I should have installed the Ethernet driver.

    Then came the dreadful part: was that new installation activated? To find out, I pressed the Windows and Pause keys simultaneously to access the system properties, searched a bit, and found, at the bottom, a message saying that Windows was not activated. Oops! I found an Activate button, clicked it, was offered to activate by Internet or by phone, naively tried the Internet activation, and that failed. A concerning error message stated that the key might be in use on another PC. Ouch! The system then proposed purchasing a key from the Windows Store.

    I was kind of stuck, not knowing what to try next, and the forum posts I found on Google didn’t help at all. One stated that Kinguin keys come from volume licenses; they may work, they may not, or they may work only for some time. Ah! No!

    In order to evaluate the extent of our losses, I tried to install Microsoft Office 2016. For this, I used my mother’s Microsoft account to log in to the Office website and found an option to enter a product key. This time, it was possible to just copy/paste it. That got me a download link for Office 2016, which I downloaded and installed. I don’t remember if I had to paste the key again in the installation program, but what I do remember is that the installation was awfully long. Something seems to be throttling my parents’ Internet connection, maybe Videotron because my parents chose a lower-end plan, I’m not sure. Anyhow, the installation succeeded, but activation, again, failed.

    The pain of the phone-based activation

    The next step was to try activating by phone before contacting Kinguin. I thus restarted the Windows activation wizard and selected the option to activate by phone. I was asked for my country and given a phone number. I first landed in an automated system asking whether I had already activated Windows, whether I had replaced some hardware, etc., but no matter what I picked, I ended up with an operator. I had to identify myself to her: name, phone number, email address. Then I needed to provide the product key. This was a long and painful process, as the phone line or my parents’ handset was causing sound issues, and I had to repeat several parts of the 25-character sequence. But it ended, and she got the whole thing checked. It was an OEM key, the kind used by vendors like Lenovo, HP, etc., but the key was valid and usable! Phew! It would, however, need a phone-based activation.

    That procedure consists of reading out the installation ID, which is split into 8 groups of six digits. The operator had me utter and confirm the digits; then, despite my doubts about having communicated that awfully long sequence correctly, we generated the confirmation code, which I entered into the second page. That one is also a long sequence of digits split into groups. I used the keypad to type the digits directly into the fields, no pen and paper needed! Then, expecting a long verification process, I clicked on Activate and got the thing activated right away. YEAH!

    After a small break, a couple of glasses of water and a short walk, I came back to the computer for the second part: Office. That one followed pretty much the same principle, with different challenges. I again needed to pick a country, then got a phone number. This time, the system was fully automated. My past experience with automatic speech recognition told me that errors are perfectly possible, so I took care to speak as clearly as possible while dictating the installation ID, again a sequence of numbers. The confirmation code given in response was uttered rather fast, so I had to be careful not to miss any digit. The keypad was essential to get this done flawlessly. After I entered the whole confirmation code, I clicked the button to activate, and that worked!

    Both Windows 10 and Office 2016 were now activated!

    After this installation and bumpy activation succeeded, I created a second Clonezilla image. If something bad corrupts the installation in the future, it will be easy to restore it from the image without having to reinstall and reactivate.

    Various small problems

    When my mother tried to access her Facebook account, she got a completely different UI with no access to games. This was because Chrome opened m.facebook.com instead of facebook.com. After that, the games took a long time to load and one of them failed, but this was caused by connection issues, not Windows 10.

    There was also a strange issue with the Volume icon in the notification area: it just disappeared the day after the installation of Windows 10. I searched for a while to figure out how to solve this. There is an option to disable system icons: all system icons were enabled, including Volume. There is a group policy to hide the icon: it was turned off. The solution was to unlock the task bar and expand it to two rows, at which point the icon showed up. After coming back to one row, the icon stayed visible.

    Fonts are smaller than on Windows 7. I thought I was going crazy or losing my sight because of too much time spent in front of my computer, but no, my mother also found the characters smaller than before. We searched and searched: no way to enlarge them, besides changing the DPI scaling. But bumping up the DPI causes her Scrabble game not to fully fit on the screen. Part of the problem is the too-small display, running at 1440×900 as opposed to a full HD 1920×1080.

    I suspect the installation, activation aside, went too smoothly. As with my own systems, problems will happen after the fact. Hopefully things won’t be too bad, but we don’t know.

  • Ubuntu 18.10: a silent release

    Usually, upgrading Ubuntu goes well. I run the upgrade tool, which downloads the new packages, installs them and then asks me to reboot. Some new versions had minor issues, for example Wayland not being fully compatible with my NVIDIA card, or an old version of MATE preventing the dist-upgrade, but nothing major, nothing that couldn’t be worked around. This time was different: no sound, and no easy way around it.

    Cannot dist-upgrade

    On Sunday, October 21 2018, Ubuntu 18.10 had been out for a couple of days. However, after I installed the updates through the Software Updater, I wasn’t offered the upgrade. I had to dig into the software options and reconfigure the delivery of releases to « every release », not just LTS. Then I reran the Software Updater and got the option to upgrade. However, clicking the upgrade button did just… nothing. I tried two or three times: same result.

    Before accepting that this time I would have to go the clean-install route, I searched a bit on Google and found that sometimes the upgrade doesn’t start while updates are still pending. But the updates were all installed. Nevertheless, running the following in a terminal found and installed a couple of additional updates.

    sudo apt update
    sudo apt dist-upgrade
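    For reference, the GUI setting I had to flip corresponds to the `Prompt` line in `/etc/update-manager/release-upgrades`, and `do-release-upgrade -c` only checks whether a new release is visible. A sketch, operating on a copy of the file so nothing on the system is actually changed:

```shell
# Switch the upgrade channel from "LTS only" to "every release".
# Done on a copy in /tmp; on a real system, edit the file in /etc with sudo.
cp /etc/update-manager/release-upgrades /tmp/release-upgrades 2>/dev/null \
  || printf '[DEFAULT]\nPrompt=lts\n' > /tmp/release-upgrades
sed -i 's/^Prompt=.*/Prompt=normal/' /tmp/release-upgrades
grep '^Prompt=' /tmp/release-upgrades       # should now show Prompt=normal
# sudo do-release-upgrade -c                # then check release availability
```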

    I then retried the Upgrade button, and that worked! Yeah!

    The upgrade went well, and then I was offered to reboot, which I did.

    Slow and silent

    At first the new release felt a bit sluggish, then pretty flaky. The first time I tried to reboot, because I needed to go to Windows to play Minecraft loaded by Twitch and recorded through OBS, the system hung waiting for a stop job. I get this issue from time to time, and when it happens, I have to wait for the 1min30s timeout to elapse before the stop job is forcibly killed. I don’t know what a stop job is, which job is frozen, or how I could get rid of this issue for good.

    I was in my living room turning off my TV and AV receiver while my new Ubuntu setup finally rebooted, and I didn’t have time to select Windows in GRUB, so it restarted Ubuntu, and trying to reboot again triggered the « stop job » issue once more! Argh, don’t tell me I’ll get this on every reboot or shutdown now? I’ll have to wait 1min30s, maybe even 3min, just to power off or reboot. What’s the point of having an SSD if timeouts like this cancel its performance benefits? I got fed up and powered off my PC with the power button. I had to press and hold the button for almost ten seconds for the machine to finally turn off, then press it 2-3 times for it to turn back on. Maybe I’m heading towards having to replace my computer case, which annoys me to the point of thinking about getting rid of the damned tower and using just a laptop. But that case issue has nothing to do with Ubuntu; I should just have picked a better case, there’s nothing more to it, at least for this post.

    After my Minecraft session, which was kind of fruitful, I wanted to check that my Ubuntu installation would boot again and shut down at least once without the « stop job » issue. I thus rebooted to Ubuntu, but the system froze before showing the welcome screen. I pressed CTRL-ALT-F2 to reach a console, logged in, checked the syslog: nothing of interest. Then the system finally booted into X. I wanted to switch back to the console to log off, but when coming back to X, it froze again on a black screen, and this time no key was working. Another hard reboot!

    The second attempt worked: I reached the desktop. I launched the backup script for my Minecraft world, then the MKV→MP4 batch conversion script. OBS records in MKV, which is more robust against crashes, but VideoStudio doesn’t accept MKV, so I have to turn the recordings into MP4. Fortunately, FFmpeg does it without loss, by just repackaging the streams into another container. Then I wanted to organize my videos into directories, so I launched one to check it, and found THE thing: no sound!
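    The lossless conversion relies on FFmpeg’s stream copy mode: `-c copy` moves the encoded audio/video streams into the new container without re-encoding. A sketch of what such a batch script may look like (my script differs; the file layout here is an assumption):

```shell
# Remux every MKV in the current directory to MP4 without re-encoding.
for f in *.mkv; do
    [ -e "$f" ] || continue                     # skip if no .mkv files match
    ffmpeg -n -i "$f" -c copy "${f%.mkv}.mp4"   # -n: never overwrite outputs
done
```

Since nothing is re-encoded, the remux runs at disk speed and loses no quality.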

    Checking pavucontrol, I found that my sound card had disappeared. The only detected devices were the HDMI port hooked up to my monitor and a crappy USB webcam that could serve as a very basic, rudimentary microphone. I don’t use that for my Minecraft recordings; I have a real AKG microphone for that! I rebooted, to no avail. Running speaker-test just hung; again, I needed to press ALT-F4 to close the terminal.

    Searching on Google for solutions led to nothing except old stuff that didn’t work. Some people restored sound by reinstalling ALSA and PulseAudio. Others had to downgrade to the previous kernel. Others edited configuration files, commenting out a line that isn’t there in my case and adding another one. I tried to reinstall PulseAudio with sudo apt install --reinstall pulseaudio, to no avail. As a last resort, I tried to reboot with the 4.15 kernel from Ubuntu 18.04, again without success.

    After almost an hour of searching, it was more and more obvious that I needed to give up on Ubuntu 18.10 and either downgrade (which essentially means reinstall) to Ubuntu 18.04, or switch to a new distribution. I was quite annoyed: the preceding upgrades had gone smoothly, and now the hardware problems were starting again like in the past. Moreover, the system froze for a second each time I hit Tab in a terminal, before displaying completions.

    Can this work at all?

    I had to go to sleep (it was past 11 PM) and go to work the day after, but I kept thinking about it. First, I needed to test whether this could work at all! For this, the simplest solution is a live USB. Monday morning, I had to resist the temptation of testing before going to work; doing so would have made me start late and thus finish late in the evening. So I did my workday first. On Monday evening, I downloaded the Ubuntu 18.10 ISO; I picked the MATE flavor since that’s what I’m using now, instead of that GNOME 3 thing which works only so-so. I got the ISO and used the disk creator tool built into Ubuntu to write it to a USB drive. However, the tool refused to format my drive: too small for the new 2GB ISO! That old 2GB key was just a bit too small, which is kind of annoying. As a smart man, I should have extra empty USB keys hanging around, but it seems I’m not smart enough for that! So I had to repurpose an existing key.

    So I took the Clonezilla key, a 16GB device, really too large for that small backup tool. I stored Ubuntu on it, then installed Clonezilla on the old 2GB stick. I booted off my new Ubuntu medium, the system started successfully, and I got sound! OK, so at worst, if I clean install, I should get sound… unless there is an incompatibility with the NVIDIA driver.
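    When the GUI disk creator balks, `dd` can also write the ISO to the key. A heavily hedged sketch: `/dev/sdX` is a deliberate placeholder and the ISO file name is an assumption; picking the wrong device destroys its contents, so check with `lsblk` first. The demonstration targets a throwaway file instead of a real device:

```shell
# Real usage (placeholder device, DO NOT run as-is):
#   sudo dd if=ubuntu-mate-18.10-desktop-amd64.iso of=/dev/sdX bs=4M \
#       status=progress conv=fsync
# Harmless demonstration of the same command against a regular file:
dd if=/dev/zero of=/tmp/fake-usb.img bs=1M count=4 status=none
wc -c /tmp/fake-usb.img                     # 4 MiB were written
```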

    Unlocking ALSA

    First I ruled out the NVIDIA hypothesis by uninstalling the NVIDIA proprietary graphics driver. After a reboot, I tested: still no sound, and I was back to a default VESA resolution. I thus reinstalled the driver, a bit annoyed that Nouveau cannot even handle my graphics card at a basic level. I don’t expect full 2D/3D acceleration out of Nouveau, but at least mode switching should work with a card bought in 2013, reaching 1080p. No, nothing. But at least there was no incompatibility between the graphics and sound drivers.

    I then dug further into ALSA, which was still detecting my sound chip. Here is the list of devices it found:

    eric@Drake:~$ aplay -L
    default
        Playback/recording through the PulseAudio sound server
    null
        Discard all samples (playback) or generate zero samples (capture)
    pulse
        PulseAudio Sound Server
    sysdefault:CARD=PCH
        HDA Intel PCH, ALC887-VD Analog
        Default Audio Device
    front:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        Front speakers
    surround21:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        2.1 Surround output to Front and Subwoofer speakers
    surround40:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        4.0 Surround output to Front and Rear speakers
    surround41:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        4.1 Surround output to Front, Rear and Subwoofer speakers
    surround50:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        5.0 Surround output to Front, Center and Rear speakers
    surround51:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        5.1 Surround output to Front, Center, Rear and Subwoofer speakers
    surround71:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        7.1 Surround output to Front, Center, Side, Rear and Woofer speakers
    iec958:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Digital
        IEC958 (S/PDIF) Digital Audio Output
    dmix:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        Direct sample mixing device
    dmix:CARD=PCH,DEV=1
        HDA Intel PCH, ALC887-VD Digital
        Direct sample mixing device
    dsnoop:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        Direct sample snooping device
    dsnoop:CARD=PCH,DEV=1
        HDA Intel PCH, ALC887-VD Digital
        Direct sample snooping device
    hw:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        Direct hardware device without any conversions
    hw:CARD=PCH,DEV=1
        HDA Intel PCH, ALC887-VD Digital
        Direct hardware device without any conversions
    plughw:CARD=PCH,DEV=0
        HDA Intel PCH, ALC887-VD Analog
        Hardware device with all software conversions
    plughw:CARD=PCH,DEV=1
        HDA Intel PCH, ALC887-VD Digital
        Hardware device with all software conversions
    hdmi:CARD=NVidia,DEV=0
        HDA NVidia, HDMI 0
        HDMI Audio Output
    hdmi:CARD=NVidia,DEV=1
        HDA NVidia, HDMI 1
        HDMI Audio Output
    hdmi:CARD=NVidia,DEV=2
        HDA NVidia, HDMI 2
        HDMI Audio Output
    hdmi:CARD=NVidia,DEV=3
        HDA NVidia, HDMI 3
        HDMI Audio Output
    dmix:CARD=NVidia,DEV=3
        HDA NVidia, HDMI 0
        Direct sample mixing device
    dmix:CARD=NVidia,DEV=7
        HDA NVidia, HDMI 1
        Direct sample mixing device
    dmix:CARD=NVidia,DEV=8
        HDA NVidia, HDMI 2
        Direct sample mixing device
    dmix:CARD=NVidia,DEV=9
        HDA NVidia, HDMI 3
        Direct sample mixing device
    dsnoop:CARD=NVidia,DEV=3
        HDA NVidia, HDMI 0
        Direct sample snooping device
    dsnoop:CARD=NVidia,DEV=7
        HDA NVidia, HDMI 1
        Direct sample snooping device
    dsnoop:CARD=NVidia,DEV=8
        HDA NVidia, HDMI 2
        Direct sample snooping device
    dsnoop:CARD=NVidia,DEV=9
        HDA NVidia, HDMI 3
        Direct sample snooping device
    hw:CARD=NVidia,DEV=3
        HDA NVidia, HDMI 0
        Direct hardware device without any conversions
    hw:CARD=NVidia,DEV=7
        HDA NVidia, HDMI 1
        Direct hardware device without any conversions
    hw:CARD=NVidia,DEV=8
        HDA NVidia, HDMI 2
        Direct hardware device without any conversions
    hw:CARD=NVidia,DEV=9
        HDA NVidia, HDMI 3
        Direct hardware device without any conversions
    plughw:CARD=NVidia,DEV=3
        HDA NVidia, HDMI 0
        Hardware device with all software conversions
    plughw:CARD=NVidia,DEV=7
        HDA NVidia, HDMI 1
        Hardware device with all software conversions
    plughw:CARD=NVidia,DEV=8
        HDA NVidia, HDMI 2
        Hardware device with all software conversions
    plughw:CARD=NVidia,DEV=9
        HDA NVidia, HDMI 3
        Hardware device with all software conversions

    The speaker-test command was hanging, but if I waited long enough after pressing Ctrl-C, it returned. OK, cool! Running pavucontrol while speaker-test was playing its noise, I could see it among the PulseAudio applications. So speaker-test was reaching PulseAudio, yet nothing was audible.

    What if I switched devices? I tried speaker-test -D sysdefault:CARD=PCH and got an error because ALSA couldn’t open the device. Searching on Google about that, not specifically to Ubuntu, led to a clue: it could be a permission issue. Checking /dev/dsp failed: no such file. But I could find the following.

    eric@Drake:~$ ls -ld /dev/snd/*
    drwxr-xr-x  2 root root       60 oct 23 08:11 /dev/snd/by-id
    drwxr-xr-x  2 root root      100 oct 23 08:11 /dev/snd/by-path
    crw-rw----+ 1 root audio 116,  9 oct 23 08:11 /dev/snd/controlC0
    crw-rw----+ 1 root audio 116, 15 oct 23 08:11 /dev/snd/controlC1
    crw-rw----+ 1 root audio 116,  8 oct 23 08:11 /dev/snd/controlC2
    crw-rw----+ 1 root audio 116,  6 oct 23 08:11 /dev/snd/hwC0D0
    crw-rw----+ 1 root audio 116, 14 oct 23 08:11 /dev/snd/hwC1D0
    crw-rw----+ 1 root audio 116,  3 oct 23 08:11 /dev/snd/pcmC0D0c
    crw-rw----+ 1 root audio 116,  2 oct 23 19:23 /dev/snd/pcmC0D0p
    crw-rw----+ 1 root audio 116,  4 oct 23 08:11 /dev/snd/pcmC0D1p
    crw-rw----+ 1 root audio 116,  5 oct 23 08:11 /dev/snd/pcmC0D2c
    crw-rw----+ 1 root audio 116, 10 oct 23 08:11 /dev/snd/pcmC1D3p
    crw-rw----+ 1 root audio 116, 11 oct 23 08:11 /dev/snd/pcmC1D7p
    crw-rw----+ 1 root audio 116, 12 oct 23 08:11 /dev/snd/pcmC1D8p
    crw-rw----+ 1 root audio 116, 13 oct 23 08:11 /dev/snd/pcmC1D9p
    crw-rw----+ 1 root audio 116,  7 oct 23 08:11 /dev/snd/pcmC2D0c
    crw-rw----+ 1 root audio 116,  1 oct 23 08:11 /dev/snd/seq
    crw-rw----+ 1 root audio 116, 33 oct 23 08:11 /dev/snd/timer

    OK, interesting: only root and users in the audio group can access these devices, and thus play sounds. Was I part of the audio group? I ran groups eric and found out I wasn’t. But on my HTPC, still running Ubuntu 18.04, I was! Luckily I had that HTPC; otherwise I would have had to reboot to the live USB to check, or boot another machine with it, like my ultrabook.
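    Checking this from a shell is quick; `id -nG` lists the groups of the current user, which matches the `crw-rw----` root/audio permissions above:

```shell
# Is the current user in the audio group?
id -nG                                      # all group names, space separated
if id -nG | tr ' ' '\n' | grep -qx audio; then
    echo "member of audio: ALSA devices accessible"
else
    echo "not in audio; fix with: sudo usermod -a -G audio \$USER, then re-log"
fi
```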

    Ok, so what if I add myself to the group?

    sudo usermod -a -G audio eric

    I had to log out and back in, and then got sound through ALSA using speaker-test! Yeah! But still no PulseAudio! I tried a full reboot, not just relogging, to no avail. OK, at least I could reconfigure Audacious to play through ALSA, but I suspected trouble with YouTube and Spotify, for which I cannot configure the audio output.

    Unlocking PulseAudio

    I tried all sorts of things to debug this. I was able to get logs out of PulseAudio in several ways, but the most useful one was through systemd. In Ubuntu, PulseAudio is started by the user instance of systemd, and journalctl gives access to the logs. I couldn’t figure out how to filter them, so I ended up running journalctl -a -f in one console and pulseaudio -k in another to force PulseAudio to restart, then checking the produced log. I eventually found errors: the system was unable to communicate over D-Bus through a certain socket. I started to think that each time the system froze, it was because PulseAudio was trying to emit a sound, waiting, and timing out.

    I couldn’t find out how to reconfigure D-Bus, nor understand the interaction between PulseAudio and D-Bus well enough to troubleshoot this. I was quite stuck, and all I could find was deprecated information. There was a bug report that seemed to affect Ubuntu 18.10, but no solution. It was past 10 PM, I had been on this for almost two hours, trying and searching to no avail, more and more pissed off and at risk of being so angry I would have trouble sleeping. I was about to give up or reinstall from scratch.

    Fed up, I decided to completely get rid of PulseAudio using sudo apt purge pulseaudio. After a reboot, sound was WORKING! What??? How come MATE plays sound without PulseAudio? Audacious was working, and VLC as well. But not Firefox for YouTube, and not Spotify.

    Before accepting the sacrifice of YouTube and Spotify to avoid a clean install, at least for now, I tried to reinstall PulseAudio. For reasons I really don’t understand, the sound kept working, and PulseAudio now displayed my internal audio device. YouTube and Spotify resumed working. I had read posts from people for whom the fix lasted only until the next reboot. OK, let’s reboot and hammer that bug for good, then! I rebooted, and sound was still working. I still don’t fully understand why.

    Either something had messed with the groups I was a member of and PulseAudio got screwed up because I had lost permission to use ALSA, or some configuration files needed to be updated but APT had decided to keep my current versions. Purging PulseAudio removed the configuration files, and reinstalling reverted to sane defaults. At least sound is working now.

    Even better, the system seems more responsive now and didn’t freeze on startup. It really seems that PulseAudio and ALSA had trouble communicating, causing these hangups.

    Why not a clean install?

    Because my home directory lives on the same partition as my Ubuntu install. Any attempt to put my /home on a separate partition leads to insufficient disk space after a while. I have a 250GB SSD shared with Windows, so I cannot dedicate 50GB to my /home and 30GB to Ubuntu alone. One simple solution would be to move my /home to a hard drive; as long as Ubuntu is on an SSD, I’ll have a fine boot time. Or I would need a way to use the SSD only as a cache and put everything on the hard drive. I could also go back to the dual-SSD strategy: one SSD for Windows, one for Ubuntu. I’ll think about it, but at least I don’t have to do anything short term. Maybe I can wait and replace everything, and get a better case to fix the power button hiccup at the same time…

    If everything had failed, before the clean install, I could have tried to restore my Clonezilla image of the SSD, which would have brought Ubuntu 18.04 back to normal. Had that failed too, I would have had to clean install 18.04, 18.10 or something else, or just give up on Linux for the moment. At least none of this is necessary anymore.

  • Trying to salvage a lemon

    My sister got a Thinkpad G500 from Lenovo pretty much at the same time I purchased my Ideapad Yoga 13 from the same brand. My Ideapad, while suffering from battery-life and wi-fi issues and being stuck with a too-small SSD, still performs reasonably. It still boots fast and doesn’t exhibit hardware issues. The upgrade to Windows 10 seems to be the change that made it worse, but it still works… compared to my sister’s G500. That machine was painfully slow and, as I discovered, suffered from various hardware issues I didn’t know about.

    Assuming the only issue was slowness, I proposed to my sister to replace the laptop’s hard drive with an SSD. Before doing so, however, I checked in the laptop’s user manual how to replace the drive. It could be done by removing the back cover of the machine and unscrewing the drive, with no need to disassemble the keyboard, remove the memory, the display, etc., like on my Ideapad or, even worse, on a MacBook. If the risk of breaking something while disassembling the laptop to reach the hard drive is too high, it is better to leave the machine alone and invest in a new one later on. This is what I chose to do for my Ideapad, because of brittle keyboard clips that could break completely, preventing the keyboard from holding on afterwards. I also needed to make sure a 2.5″ SATA SSD would fit, because maybe an M.2 would be needed instead. If something intermediate is needed, like mSATA, upgrading is less worthwhile. A SATA or M.2 SSD can at least be reused in another system if the target laptop fails later on.

    The SSD doesn’t fit! WHY?

    I started working on the laptop on January 5, 2018. I first removed the battery, unscrewed the back cover and located the hard drive. It was attached to the system with a bracket screwed into the chassis. I removed the screw and was able to disconnect and pull out the drive. I then needed to unscrew the bracket from the hard drive and screw it onto the SSD instead. That part didn’t go well. No matter how hard I tried, I couldn’t align the screws. I thought the laptop had a 1.8″ hard drive instead of a 2.5″, but no, it ended up working: the SSD had something written on both sides, so I had been trying to install the bracket on the wrong side. Once the bracket was screwed on, I connected the SSD to the laptop.

    After I installed the SSD, I put back the cover and the battery. Then I turned on the laptop and booted from a USB key containing an Ubuntu live session. I used it to run a SMART self-test on the SSD, making sure it wasn’t flawed right from the start. This is more likely to happen with a hard drive, but it CAN happen with an SSD as well. Doing a self-test before installing anything can thus save a lot of time.
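    From the live session, smartmontools does the same from the terminal. The `smart_ok` helper below is mine, not part of smartctl; it just reads the overall health line:

```shell
# Tiny helper (hypothetical): succeeds when `smartctl -H` reports PASSED.
smart_ok() {
    grep -q 'PASSED'            # reads smartctl -H output on stdin
}
# Typical usage on a real drive (needs smartmontools and root):
#   sudo smartctl -t short /dev/sda              # launch a short self-test
#   sudo smartctl -H /dev/sda | smart_ok && echo "drive looks healthy"
```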

    Stuck at Lenovo logo, requiring RMA

    After that check, I inserted my USB-based Windows 10 installation medium. I got this medium from Microsoft; it can now be downloaded freely, as opposed to previously, when it required the purchase of a CD or DVD. Getting the medium is now easy; the challenge is getting the activation working.

    But before tackling the activation issues, I had to make this laptop boot the USB medium. Instead of booting, the laptop just froze at the Lenovo logo. The only thing I could do was hit Ctrl-Alt-Delete, get a blank screen, and see the logo again. Powering the laptop off and back on didn’t help either.

    I searched on Google and found other occurrences of this issue. Several people were experiencing the problem, with no other solution than contacting Lenovo’s technical support to get the laptop replaced… when it is under warranty. More and more, it seems that once the warranty is over, a laptop is a piece of crap that is only good for the trash, which annoyed me quite a lot. Fortunately, after a couple of attempts, removing the battery, putting it back in, rebooting again and again, I got past the frozen Lenovo logo. After that hurdle, I was able to install Windows 10.

    Windows 10 asked me to connect a Microsoft account. I used mine to perform the installation, but I knew I would later need to hook up my sister’s account, so she could log in without me giving her my main password.

    Activation working without effort? Strange…

    After I finished installing Windows 10, I noticed from the system properties that it was activated. Cool. However, I found that, without even asking me, the system had used the same product key as my own Lenovo Ideapad Yoga 13 ultrabook. I worried that Microsoft’s USB medium creation tool had customized the installation key with my product key, and that I was thus activating Windows on multiple computers with the same key. Maybe at some point Microsoft would detect that and deactivate one of the two copies, either mine or my sister’s, after she got her laptop back.

    After a significant amount of time searching on Google, I figured out that this situation had happened to others. It seems the same product key is used by several laptops of a given brand. Either the key is hard-wired somewhere in a read-only store of the machine, maybe the trusted platform module, or Microsoft keeps a registry of OEM machine IDs mapped to product keys. Anyhow, my sister got a fully activated Windows 10 without any effort on my part.
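    For what it’s worth, on machines shipped with Windows 8 or later, the OEM key is typically embedded in the firmware’s ACPI MSDM table, which Linux exposes under /sys; whether this G500 has one is my assumption, but looking is harmless:

```shell
# Look for a firmware-embedded Windows product key (needs root to read the
# table; it simply doesn't exist on machines without an embedded key).
TABLE=/sys/firmware/acpi/tables/MSDM
if [ -r "$TABLE" ]; then
    strings "$TABLE" | tail -n 1    # the key is usually the last printable string
else
    echo "MSDM table absent or not readable (try as root)"
fi
```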

    I had worried that I would need to have her purchase a new license or reactivate her current one by phone. Or, even worse, that the current license would be for Windows 8, not Windows 10, and my sister would prefer getting Windows 8 back. All my concerns went away with this effortless activation, but keep in mind things will not always be as smooth.

    Looping Windows Update

    That one caused me a lot of wasted time. There was a bug with one of the updates, which failed to install. However, the update was partially installed. The failure caused Windows to retry installing the update each and every time the computer was shut down or rebooted. This pretty much wiped out the benefits of having an SSD, because shutdown, reboot and even boot times increased: at boot, Windows was finalizing the installation of the faulty update, and failing!

    I searched a long time for that one and tried to manually install the update, to no avail. Some forum posts referred to the problem without a known solution. Sometimes, it worked at some point after a lot of attempts. Other times, it required a reinstallation. I even read a post from a user who called Microsoft, got a new ISO image of Windows 10, installed that, and the update worked! But then, why is there a media creation tool if we need to get an ISO from elsewhere? Will everybody reinstalling Windows 10 from scratch need to call Microsoft to get that alternate ISO? Really? That is a serious bummer in my opinion. I thought about asking for a link to that ISO, but that would have given me the English version while my sister wanted the French one.

    At some point, I got so pissed off that I started searching for a way to disable the automatic updates. This is possible by changing group policies… on the Pro version of Windows 10. My sister had the Home version. But the problem didn’t happen on my own systems, probably because I started from Windows 8, then got 8.1, then 10. Maybe I’d need to go that same route on that ThinkPad?

    Fortunately, there is a tool from Microsoft called Show or hide updates. I installed that tool and told it to hide the offending update. That fixed the issue without having to tentatively reinstall, call Microsoft, or install Windows 8, upgrade to 8.1, then upgrade to 10!

    Short-term laptop to be replaced after warranty?

    What happened after the installation of Windows 10 pretty much led me to believe what this section’s title states. I was really shocked, annoyed and questioning my trust in Lenovo. A PC is not like a simple appliance you just plug in and start. It has settings, it has applications installed, and there is sometimes even a physical configuration to get used to. I cannot afford to fully redo that configuration every year or two! That is serious nonsense, not counting the very bad ecological impact of such a short-sighted offering. I know, it was not my laptop, but my next laptop could very well suffer from these more and more common flaws.

    The freezing at the Lenovo logo was just the tip of the iceberg! I then discovered that only Windows 8 drivers were offered for that ThinkPad G500 on the Lenovo website, no Windows 10 drivers at all. However, the preinstalled drivers coming with Windows 10 allowed the machine to pretty much work. Later on, I found out that the DVD drive was broken, not detected at all by Windows. Moreover, the webcam was broken, showing a black screen.

    Wi-Fi started to go bad, the mouse pointer started to move erratically, and the machine just became totally unusable. I had to plug the laptop into an external keyboard, an external mouse and even an Ethernet cable.

    Again, for the DVD drive and webcam, the only solution was to contact Lenovo and get replacements, IF the laptop was still under warranty. At this point, I was stuck. Without another idea, I would have had to give up and tell my sister the best solution was to throw that laptop away.

    Pulling some things off

    I noticed a black tape over the webcam. Maybe that tape was not supposed to be there and could be removed. I removed it and started the Camera application again: I got an image, yeah!

    I then found out that some people were having issues with both the DVD drive and the freezing at boot. Could these two be related? If the DVD drive is flaky, it can slow down or even prevent booting, as the BIOS/UEFI will query it to figure out whether a disk is inserted.

    The solution was quick and simple: just remove the optical drive from the machine. Yes, really, pull it out. After I did that, the laptop booted flawlessly. I tested it several times and also tried rebooting, without any freezing issue. It also seemed, although I didn’t benchmark it formally, that booting was faster. There would be a hole in the laptop casing where the DVD drive used to be, because I didn’t have another drive or a dummy filler for the bay. But at least, it would work.

    Transferring ownership without my password

    A small but significant step remained: how could I allow my sister to log in to her « new » laptop without giving her my personal password so she could create her Microsoft account, and without getting her password to connect to her account? There are two ways to solve this cleanly, and I implemented both to be sure.

    1. Create a new account based on a known email address linked to a Microsoft account. I knew my sister’s email address and was sure she had used it to create her Microsoft account, so I just had to create the account with that address; no need to enter the password until the first login. There are two pitfalls though. Firstly, the person needs to be connected to the Internet for the first login, and I wasn’t sure Wi-Fi would work, maybe just wired Ethernet, since you need to be logged in to set up Wi-Fi! Secondly, the created account is not an Administrator by default; I had to fix that so my sister would be able to install new programs on her machine.
    2. It is still possible to create a local account, so I did that and set a dummy password my sister can change afterwards, unless she prefers to remove the local account altogether. Again, I needed to make sure the local account was in the Administrators group; by default, it is not!
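    For the record, the local account from option 2 can also be created without the GUI, from an elevated Command Prompt; a minimal sketch, with a hypothetical account name and dummy password:

```shell
rem Hypothetical account name and temporary password; run in an elevated Command Prompt.
net user Sister ChangeMe123 /add
rem Without this second command, the new account is NOT an Administrator.
net localgroup Administrators Sister /add
```

The second command covers the Administrators-group pitfall mentioned in both options.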

    My sister was amazed at the speed increase we achieved by replacing the hard drive with an SSD. She thought that laptop was good for the trash can before I fixed it. Even funnier, her boyfriend later got an HP 15-bw028ca that, although more recent, happened to be slower than that old fixed ThinkPad!

    Eventually that HP piece of junk will benefit from a similar treatment. Maybe that will deserve another post.

  • Spurious mail delivery errors

    A few weeks ago, I started to receive emails containing error messages about the delivery of mails I didn’t send. The contents of these emails looked like spam, but why weren’t they detected by the anti-spam functionality of GMail? Maybe spammers had found a new way to send their junk that circumvents current filters. But a few weeks later, the annoyance persisted. I was receiving at least one of these emails per day, sometimes several. I started to suspect somebody had hacked into my GMail account and was using it to send spam, but I couldn’t find any trace of these messages in my « Sent » folder. Maybe they could circumvent that as well. Would I have to change my password just in case? And what would tell me they wouldn’t hack in again?

    On Friday, April 14th, I got fed up with this. First, do these messages come from the same sender or group of senders? If so, I could block those addresses. Otherwise, there was a problem with GMail that would need to be solved eventually; otherwise I would have to switch from GMail to some other email service. Looking at the sender’s address, I found out the messages were coming from something @ericbuist.com. Could it be that the mail account from my Web host was misconfigured?

    I logged onto my HostPapa cPanel and reached the mail options. I found out that anything @ericbuist.com not corresponding to a valid email account was sent to a default email address. As a result, spammers in need of a fake origin address can pick anything @ericbuist.com in the hope it won’t correspond to a valid address. I thus reconfigured the default route to return an error email instead of redirecting the message. I also found out that besides redirecting traffic to my GMail account, the HostPapa mail service was keeping a copy of the messages. I thus had 250 MB of junk emails there, which I deleted to free space. Although disk space is unlimited on my HostPapa plan, if every customer abuses it by leaving junk in their account, HostPapa will have to impose quotas at some point.

    I didn’t receive other emails about mail delivery failures after that. Unfortunately, this is not the only cause of such problems. Other people had these issues because they forwarded all their GMail messages to a service sending SMS, and the service went down. They had to disable that forwarding from their GMail accounts. Things get worse when other email addresses are redirected to a central email account. All of these can be the cause of spurious emails and thus need to be checked in case of issues.

  • Bumpy Ableton Live session

    Yesterday, I tried upgrading to the latest Ableton Live, version 9.7.1. Everything went well, but I ran into other issues, not related to Live, that made my work session quite bad and frustrating.

    S/PDIF not working great

    A month ago, I got a new audio interface: the Focusrite Scarlett 18i20. This amazing device provides eight analog audio inputs and ten outputs. This is far from the advertised 18 inputs and 20 outputs, but those counts include S/PDIF and an add-on card that plugs into the optical ports of the interface. Anyway, eight inputs is more than enough for my needs. I have difficulty playing one instrument reliably, so I won’t start playing multiple instruments at the same time, at least not now!

    I didn’t have enough long audio jack cables to plug in my Novation Ultranova (two channels), my Korg EMX (two channels) and my Nord Drum (one channel), so I decided to try hooking up my Ultranova through S/PDIF instead. For this, I used an RCA cable I had gotten somewhere I don’t remember. I plugged the S/PDIF coaxial output of the synthesizer into the appropriate input of the audio interface, then fiddled with MixControl to figure out HOW to enable S/PDIF. Easy, I thought: just set up one entry in Mix 1 to route S/PDIF L to the left channel and S/PDIF R to the right channel. The Mix 1 mix was already routed to the two monitor outputs of the interface. With that, I should have obtained sound from my Ultranova in my audio monitors. No, nothing! I verified that the S/PDIF output was enabled on my Ultranova: it was.

    I tried and checked many times, searched on the Web, and OK, I had to set the sync source to S/PDIF instead of Internal in MixControl. Did it, no result. I spent at least half an hour trying, checking and trying again, only to find that the volume of my Ultranova was turned all the way down to the minimum. Turning up the volume solved it!

    BUT I started to hear crackling sounds from time to time, especially when playing long notes with pad-style sounds. That means S/PDIF doesn’t work well out of my Ultranova or into my audio interface, or that it requires a special cable I don’t have. But then WHY is the S/PDIF connector the exact same shape as an RCA connector? (The plug is the same, but coaxial S/PDIF expects a 75-ohm cable, which a random analog RCA cable is not, and that mismatch can cause exactly this kind of glitch.)

    There is no solution for the moment, except using the analog jacks and thus not being able to plug in my EMX, Ultranova and Drum at the same time.

    Jumpy mouse

    While trying to work with Ableton Live and MixControl, I had to cope with too-small fonts all the time. I ended up using the Windows zoom (Windows key plus +). But the zoomed view regularly jumped all around. I figured out it was the mouse pointer that was regularly moving around for no obvious reason. Ah, so this is why I am now literally constantly losing the pointer, forced to bring it back to the upper left corner of the screen almost every time I want to click on something! The pointer really is jumping around; I’m not going crazy! This made working with the mouse a real pain, similar to what I experienced with the old Mac my brother’s wife gave me a year ago. I had thought about running Live on that Mac, because many people claim that Macs are more stable for music production, but the machine is way, way too slow for that, so I just forgot about it and never tried!

    I ended up trying another mouse, which seemed to be a bit better, but I then realized that its right button was completely non-functional!!! Why on earth did I keep this stupid mouse, then? I threw it in the trash can and put the first one back. Then I figured out that putting the mouse on a piece of white paper helped, making it a lot less jumpy.

    Windows Update restarting the computer while I’m using it

    Windows 10 sometimes automatically restarts the computer to apply updates. Up to now, this had only happened while the machine was idle. Well, yesterday, it happened right in my face, while I was working with Live! I got so pissed off by this that I tried to disable this really bad functionality, and I fortunately figured out a way to disable these forced restarts. It was relatively easy, although it caused me trouble because my Windows is in French and the procedure was in English. If this procedure doesn’t work and spurious reboots happen too often, it may force me to downgrade to Windows 8 or Windows 7, or switch to a Mac and have constant trouble with too-small fonts. This could be a dead end leading me to stop using my computer, or at least stop trying to make music with it.

    Slower and slower machine

    My main computer is on a desk while my music gear is on a table against the opposite wall. I tried to link them together using a long USB cable and a hub, but that failed with crashes from Ableton Live. However, my attempts were with the audio interface built into my Ultranova. Maybe I’ll have more luck with my Focusrite, if the cable and hub are stable enough. Why a hub? Well, it is there to get a keyboard and mouse next to my music table. I will also carry video through an HDMI cable and get a screen nearby as well.

    But for now, I ended up having to use my Lenovo IdeaPad Yoga 13 ultrabook for my attempts at music production. This worked relatively well, but the machine has been getting slow since I updated it to Windows 10. Searching forums gives no results, except that other people are experiencing performance problems too, sometimes on Windows 10, sometimes on Windows 8.1. Starting Live now takes almost 45 seconds on this machine. Fortunately, the program is responding correctly for now, until of course I add enough tracks and effects to my Live set to make it choke like crazy. I guess this will happen if I go far enough in music production.

    Difficulties with music production itself

    Creating the track I had in mind caused me great trouble. While not super complex, it is not a trivial repeating drum beat. I managed to play it a couple of times, started the recording on my EMX and messed it up completely. I tried again, messed it up again. I cannot play it reliably unless I try 25 times or more. The workaround is to correct the notes afterwards, but this is quite tedious on the EMX. Tired of this, I tried to record MIDI using my Ultranova as a source and Live as a sequencer. But even in Live, fixing the incorrect notes was a real pain. I experimented with quantization, which also didn’t work correctly.

    There are no well-defined workflows and no comprehensive tutorials about music production. All I can find are case-specific pro tips, sometimes involving plugins I don’t want to install yet. I’m overwhelmed enough with Live itself, having to constantly check and redo what I am doing; this is not a great time to complicate things with plugins.

    Conclusion

    Although I am having less and less fun with all this for the moment, I feel I can manage to get something good out of it. If I gave up because of difficulties, I would not have been able to get a Ph.D., keep my job for more than seven years or create a modded Minecraft map.

  • Shocking problem with audio channels

    A couple of months ago, I bought myself a condenser microphone to improve the quality of the recordings in my Minecraft videos. However, such microphones require an XLR connection delivering phantom power. An audio interface or mixer is required to power such microphones and get the captured audio out of them. My first setup was a bit convoluted and required two cables going from my computer desk to the table on which I installed my music production gear:

    1. My microphone is on my computer desk and linked to my mixer with a cable running on the floor.
    2. My mixer is sending phantom power and getting the microphone’s audio. It gets a mono signal and spreads it to its two output channels. From the mixer, it is possible to adjust the microphone’s volume as well as its position in the stereo image.
    3. My mixer is sending its output, including the microphone and other sound devices, to my Novation Ultranova.
    4. My Ultranova is linked to my computer through a USB cable which also has to run on the floor.
    5. The audio interface built into my Ultranova is used to turn analog sound coming from my mixer into digital audio.

    After some changes in my home office, I had to move the table with the music gear further from my computer desk, which prevents me from using this setup until I get longer cables. I may instead end up with a second computer dedicated to music production, which would make controlling Ableton Live easier than going back and forth between my music table and computer desk. I thus needed a new solution for my microphone setup.

    Luckily, I have an M-Audio FastTrack Pro interface I decided to give a new shot. The interface had issues with Ableton Live, making the software crash and misbehave intermittently. The issue could come from the interface itself, the ASIO driver, Windows 10, Ableton Live or something else. There was no way to track it down, which is why I switched to using my Ultranova as the audio interface. But maybe, I thought, the M-Audio FastTrack Pro would just work for this simpler application.

    I thus put it on my computer desk, plugged it in through USB, plugged my microphone into the first input and turned it on. I made sure the first input was configured in Instrument mode, turned on phantom power and then performed a test. I had a voice-over to record at the end of an in-progress Minecraft video. For this, I usually use Corel VideoStudio X8.

    However, when I listened to the recording, the sound was correct, but it was coming from the left channel only. It didn’t take me long to realize that Corel VideoStudio was accessing my audio interface as a stereo device. The interface was then simply and predictably providing stereo information: the left channel coming from the first input, the right channel coming from the second input. Nothing plugged into the second input? No problem, the interface just provided silence. This is simple and logical, but today’s software expects different behavior: VideoStudio was assuming the interface would magically duplicate the single input on both channels! Apparently, some low-end USB microphones just do that! I also realized that my recording software would react the same way; my voice would play only on the left side.

    Searches on Google only gave me unacceptably complicated solutions.

    • Post-process the audio in another software tool like CoolEdit Pro, Audacity or Sound Forge to turn the stereo file into a mono one. That would have forced me to figure out the name VideoStudio gave to my voice-over, maybe even export the clip manually from VideoStudio to a WAVE file, find the file in the audio editor of my choice, search forever to figure out how to make the file mono, save the file back somewhere, return to VideoStudio, find the file there and import it. If I had to do this once, I would do it and that’s it. But I would have to repeat all of that for every voice-over I make with this new setup!
    • Encode the video with mono audio. Besides requiring a lot of tedious manipulation in VideoStudio (click there, find that option, click there, there, there, there, etc.), this is unacceptable, as my game sound is stereo and I want to keep it that way.
    • Insert a Y splitter cable linking my microphone to both inputs of my audio interface. That could work in an RCA or jack world, but I’m not sure at all about the result with XLR plugs delivering phantom power! Of course, nobody has accurate information about that. According to my distant memories of the electricity I learned in physics, both XLR inputs would deliver 48V, resulting in a circuit with two parallel paths delivering 48V, so the output of the Y would get 48V, not 96V; but maybe I was wrong, and that would just blow up my microphone. I would also have to order this Y splitter cable on eBay or from Addison Électronique and wait several days for it, or go to the Addison store, which involves a never-ending bus trip for me.
    • Some forum posts suggested that the software tool is responsible for correctly configuring the audio interface, and that if it doesn’t, I should switch to something else. That would mean using one tool to capture video and a second tool to capture audio, and managing to sync them up in some way or another. That means having the two tools side by side and rapidly clicking on the record buttons, hoping they start simultaneously. That’s stupid, crazy and inefficient, and I really hate that people propose, adopt and accept such solutions just because they are not so bad for them. This is bad, because computing is all about automation, and should not force human beings to repeat stupid, brain-killing tasks!
    • According to my research, some USB microphones deliver a stereo signal to Windows, which just avoids this issue. I could thus switch to such a microphone and forget about my current device. I really disliked that idea, because I didn’t want to replace an already-working microphone with a potentially inferior one. And what would happen to my current microphone? Well, maybe my brother would make use of it in his jamming room. Quite a small consolation…
    • Maybe another audio interface would handle this issue better. I could, for example, try the FastTrack Solo interface, which has a single input, so no obvious reason to deliver stereo data. However, I had no certainty about if and how that would work; I would have had to try my luck. Maybe my brother could help me out if he has the Solo M-Audio interface, maybe not; I didn’t remember which one he had.
    • My friend suggested using my mixer as before. That would require me to unplug all the wires from my mixer, move it to my computer desk to record stuff, then move it back to my music table and plug everything back in to play some music. Quite annoying.
    • My friend suggested using the inserts on the M-Audio interface. This quickly appeared to be a hard task, as making use of them requires custom cables designed for inserts. In particular, I would need a Y splitter going from a TRS balanced jack into two separate mono jacks! Most jack Y splitters just duplicate a stereo signal. The only TRS Y splitters I could find were on eBay.
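    If I ever got tired of clicking through an audio editor, the post-processing option in the first bullet could at least be automated from the command line. A minimal sketch, assuming ffmpeg is installed and that the clip exported from VideoStudio is named voiceover.wav (file names hypothetical):

```shell
# -ac 1 would average the two channels, halving the level since the right
# channel is silence; the pan filter keeps only the left channel instead.
ffmpeg -i voiceover.wav -af "pan=mono|c0=c0" voiceover-mono.wav
```

This still leaves the whole export/import dance with VideoStudio, which is why I kept looking for a capture-side fix.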

    I was quite desperate and about to give up on recording or switch back to my H2N, which works but produces recordings with a lot of background noise. My last hope was Virtual Audio Cable. Tailoring it to my needs required a bit of trickery, but it ended up working, so I purchased a license for it.

    From stereo to mono with Virtual Audio Cable

    The first piece of this intricate puzzle can be found by right-clicking on the Windows mixer icon in the task bar and selecting Recording devices.

    Screenshot 2016-08-20 21.27.42

    Double-clicking on the M-Audio line device and accessing the last tab results in the following.

    Screenshot 2016-08-20 21.27.47

    The default input setting is two channels, thus stereo. Interesting. What if I switched this to mono? Wouldn’t this be enough to tell both VideoStudio and Bandicam to record a mono track? If they simply use default settings, that could work, no? Well, no, because the M-Audio driver doesn’t accept any setting other than two channels! I tried with both the Windows built-in driver and the M-Audio one: same result. I probably need a better audio interface. But this is enough for DAWs such as Ableton Live, which are able to pick and choose which channels to record from.

    I thus had to implement a patch using a virtual cable. For this, I accessed the second tab of the M-Audio line device, which allows listening to the captured audio. However, instead of feeding the captured audio to the default device as most people would do, I routed it to a virtual device provided by Virtual Audio Cable.

    Screenshot 2016-08-20 21.32.55

    That Line 1 entry appears in both the playback and recording devices. This is a virtual cable that can be used to transfer audio from one process to another. Based on this reasoning, I found the Line 1 entry in my recording devices and made it the default recording device. In my case, it is called Mic 1 because I messed around in the control panel of Virtual Audio Cable, but that’s not necessary.

    Screenshot 2016-08-20 21.33.13

    Hoping for a miracle, I double-clicked the virtual recording device, accessed the last tab and clicked on the drop-down menu for channel selection. I was then able to select a 1-channel input!

    Screenshot 2016-08-20 21.33.19

    I then tested, and that finally worked! Windows « plays » the captured audio into the virtual cable, which coerces it into mono and lets software programs « record » it. After a lot of frustrating research with less and less hope for a solution, I ended up with a working recording setup again. I had to purchase the full version of Virtual Audio Cable for this to work without the annoying « Trial » message in my recorded sound, but at least I didn’t have to wait for a Y splitter cable ordered from eBay, or try my luck with USB microphones or new audio interfaces without being sure they would solve my issue.

  • Ubuntu 16.04 almost killed my current HTPC setup

    Yesterday, I tried to upgrade my HTPC running Ubuntu 14.04 to the new LTS, 16.04. That almost went smoothly, but some glitches happened at the end, and some changes prevented my Minecraft FTB server from starting again. The problems are now solved, but for a while I was wondering if I would ever get this working again.

    I had two hopes for this upgrade: getting an intermittent, awful audio glitch fixed and having the ProjectM visualization work again. From time to time, when I start the playback of a video file, I hear an awful, super-loud distortion instead of the soundtrack. I then have to restart playback. Usually that’s enough; sometimes I have to restart it twice. Fortunately, the audio doesn’t go crazy during playback. The ProjectM visualization started to fail, I think since Kodi 16. It just doesn’t kick in, leaving me with a blank screen. At least Kodi doesn’t crash or freeze as some versions of XBMC did when ProjectM was unable to access the Internet reliably.

    CloneZilla failing to start

    The week before the upgrade, I wanted to back up the SSD of my HTPC using CloneZilla, in case problems happened. I used an old version I had burned to a CD, because I thought this 2009 HTPC couldn’t boot from USB sticks. Well, that old version, although it worked on my main PC, failed to start on my HTPC. It simply froze without any clue about what was happening. Before downloading the new version and burning it to a CD, I noticed that my external USB hard drive showed up in the boot options when pressing F8 at computer startup. I thus tried to boot my CloneZilla USB stick, which runs a more recent version, and that worked. I don’t know if my HTPC was always able to boot off USB; maybe this capability was added by a BIOS upgrade. That was a good thing, and it allowed me to perform my backup.

    Dist-upgrade or clean install?

    Several people on forums recommend performing a clean install, claiming that too many things changed from one version to the other. That may be true in some cases, and it is probably the safest route, but unfortunately, a clean install doesn’t always detect the drives to mount, requiring time-consuming modifications to /etc/fstab (with copy/pasting of drive UUIDs), and then I would have to figure out which packages were previously installed and reinstall them. I also have a couple of cron jobs performing automatic backups of my Minecraft worlds that I would need to recreate.
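    To give an idea of the recreation work involved, here is a sketch of the two pieces a clean install would force me to redo; the UUID, mount point and paths below are placeholders, and the lines are staged in local sample files rather than the real /etc/fstab and crontab:

```shell
# The UUID to copy/paste comes from running, e.g.: sudo blkid /dev/sdb1
# (device name is a placeholder). The /etc/fstab line to recreate then looks like:
echo 'UUID=1a2b3c4d-5678-90ab-cdef-1234567890ab /mnt/data ext4 defaults 0 2' > fstab.sample
# And a cron job backing up a Minecraft world every night at 03:00;
# note that % must be escaped as \% in a real crontab entry:
echo '0 3 * * * tar czf /backups/world-$(date +\%F).tar.gz /opt/ftb/world' > cron.sample
```

Multiply that by every drive and every backup job, and the appeal of the dist-upgrade route becomes obvious.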

    Instead of doing that, I tried to use the Update Manager to perform a dist-upgrade. Unfortunately, by default, the tool won’t go from one LTS to the other. You have to go all the way through 14.10, 15.04 and 15.10, then 16.04! Each dist-upgrade would have taken at least two hours, making this process really painful nonsense. Instead, I tried calling update-manager -d and got the option to go directly from 14.04 to 16.04!

    During the installation, I thought that if the power supply of this relatively old system died during the process, the system would probably be unrecoverable, requiring a backup restore or a clean install. Ouch! Luckily, no such thing happened.

    TeXLive broken

    During the dist-upgrade, I got some error messages because the updated TeXLive-related packages couldn’t be configured properly. Why is TeXLive installed on this HTPC? I don’t remember exactly. I don’t need to compile any LaTeX documents on this machine, so this didn’t seem like an issue at all to me. I just asked the installer to ignore the errors and noted to myself to delete the TeXLive packages after the upgrade, to be sure not to run into issues if, for some obscure reason, I wanted to compile a LaTeX document later on.

    Failed dist-upgrade

    Unfortunately, the dist-upgrade aborted with an error: no accurate information, just a message telling me that the dist-upgrade had failed. Argh! The system couldn’t shut down or reboot anymore, even when running sudo reboot from the command line. I was so frustrated that I considered shutting down this machine, which had caused me issue after issue for more than seven years, and never turning it back on again. However, if I couldn’t recover from this failure, I could still restore my CloneZilla image after taking a break from this catastrophic upgrade. In other words, not everything was lost.

    I tried pressing the power button a couple of times; the screen became blank and remained blank for a few seconds, then the stupid machine rebooted. At least, the broken Ubuntu installation started up to the GUI. Assuming the main issue was the TeXLive glitch, I opened a terminal and tried to remove the TeXLive package: sudo apt-get remove texlive. This failed: apt-get reported errors about the TeXLive-related packages that weren’t configured. I tried to remove the package using dpkg, which complained that texlive wasn’t an installed package. I then searched for the packages using apt-cache pkgnames tex, and ended up removing tex-common. That got rid of the incorrectly configured packages and unblocked apt-get.

    After this, I ran apt-get update, then apt-get dist-upgrade, which installed a couple of additional packages. Then I ran apt-get autoremove to remove the obsolete packages. This, hopefully, completed the dist-upgrade. I also rebooted to make sure the system could still boot after all that.
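    Summarized as commands, the recovery sequence that unblocked the machine looked like this (the half-configured package was tex-common in my case; on another system, the offender could be a different package):

```shell
# Remove the half-configured package that was blocking apt-get
sudo apt-get remove tex-common
# Let apt finish whatever the aborted dist-upgrade left undone
sudo apt-get update
sudo apt-get dist-upgrade
# Clean out packages made obsolete by the upgrade
sudo apt-get autoremove
```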

    OpenJDK 8 causing issues

    This HTPC runs a Minecraft world my friend and I share. We log onto that map less and less often, because my friend rarely plays and I am currently focusing on Agrarian Skies 2 rather than the old FTB Monster pack the map runs on. But I am considering starting a map on the FTB Infinity Expert Skyblock pack once I’m done with (or completely blocked in) Agrarian Skies 2, and I would like to run it on a server with an auto-backup strategy in place and the possibility for friends to join in if they want. I thus wanted to keep the possibility of running Minecraft servers on my HTPC.

    Now, when I started the FTB Monster server, I was greeted with a meaningless ConcurrentModificationException. I might be able to retrieve the stack trace, but that is a bit pointless, as it repeatedly refers to nonsensical internal class names. OK, this is probably broken because of Java 8 and won’t get fixed unless I upgrade the mod pack, which would either force me to start from scratch on a new map, or require hours and hours of work to convert the map, and the map would be quite damaged after the upgrade. In particular, the switch to the Applied Energistics 2 mod would destroy my logistics network so badly that it would require a complete redesign and rebuild. This would be even worse than the switch of Thermal Expansion and IC2 that occurred when I (painfully) migrated from Unleashed to Monster.

    Simple solution: run this under OpenJDK 7. That’s simple under Windows; unfortunately, there is no OpenJDK 7 package in apt-get for Ubuntu 16.04! Maybe I could have fiddled with PPAs or installed Oracle’s JDK outside of the apt-get packaging system, but what’s the point of having a packaging system if it requires so many workarounds? I also thought about running the server in a Docker container built from an image providing Java 7, but that’s a bit convoluted and could cause other issues. Who knows if the server would behave well when running in a Docker container? It probably would, but that remains to be tested.
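    For the record, the Docker idea would have looked something like the following sketch; the openjdk:7 image, the paths and the server JAR name are assumptions, not something I tested:

    ```shell
    # Hypothetical: run the FTB server under Java 7 inside a container.
    # The server directory is bind-mounted so the world data stays on the host.
    docker run -it --rm \
        -v /path/to/ftb-monster-server:/server -w /server \
        -p 25565:25565 \
        openjdk:7 \
        java -Xmx4G -jar ftb-server.jar nogui
    ```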

    Fortunately, I figured out a way to patch the installation by adding a new JAR to the mods folder. The JAR comes from http://ftb.cursecdn.com/FTB2/maven/net/minecraftforge/lex/legacyjavafixer/1.0/legacyjavafixer-1.0.jar and was recommended in a forum post at http://support.feed-the-beast.com/t/cant-start-crashlanding-server-unable-to-launch-forgemodloader/6028. Installing the JAR fixed the issue and allowed me to start the server!
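    Installing the fix boils down to dropping the JAR into the pack’s mods folder; the server path below is a placeholder:

    ```shell
    # Download LegacyJavaFixer into the server's mods directory.
    cd /path/to/ftb-monster-server
    wget -P mods/ http://ftb.cursecdn.com/FTB2/maven/net/minecraftforge/lex/legacyjavafixer/1.0/legacyjavafixer-1.0.jar
    ```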

    Totally unexpected, very frustrating

    In order to test my Minecraft server, I started the FTB Launcher on my Ubuntu 16.04 main computer. From the launcher, I started the FTB Monster pack: crash. OpenJDK 8, again. I had to apply the JAR patch on my client as well. I did it (instead of fiddling to manually install JDK 7) and that worked. I was able to log onto my server and enter my world. However, as soon as I pressed F12 to go full screen, the screen went blank and everything froze. No way to get out of the game by switching desktops, no way to kill the game window with ALT-F4. I would once again have had to go to another machine, SSH into my main computer, kill the JVM, fail, try again with kill -9. Instead, I just rebooted the machine, tried with Windows, and that worked. My Minecraft setup was correct; it’s just that the client now requires a different video card or driver to work reliably on Ubuntu, even though I changed from onboard Intel HD to an NVIDIA GeForce add-on card in 2013 for that very reason. Having to switch graphics cards back and forth from one Ubuntu version to another is total nonsense to me.

    Kodi is gone

    I don’t know exactly how that happened, but Kodi, the new name of XBMC, got removed during the upgrade. Simply reinstalling it was enough to fix this. Kodi still works fine for music and video playback. ProjectM visualization is still broken, though, but that’s not a big deal. I haven’t heard the audio distortion since the upgrade, but it’s too recent to tell whether it’s gone for good.

    Conclusion

    For now, I’m not sure it was worth it, but at least it didn’t break things. The main functionalities of my HTPC are still there: the Minecraft server runs, I was able to listen to YouTube videos, Kodi works for music and videos, and SSH is working properly. I’ll have to see if other surprises are awaiting me.

  • Taking control of one’s own machine

    Not being an administrator on one’s own Windows-based PC or laptop is a real shame. It prevents the installation of most software programs, and some settings are not accessible. This situation is most commonly caused by system administrators on a power trip, but it can also happen on a home computer configured for multiple users: everyone runs on regular user accounts and sometimes, less and less often, switches to an administrator account to install software. The inevitable then happens: the administrator password gets forgotten.

    The simplest solution in this case is to wipe the computer and reinstall Windows, but two years ago I needed to do better than that. This post describes what happened and what I did to get around the issue. Anyone trying this should be careful and aware that it can cause trouble, especially if the gained privileges are misused afterwards. I only gained administrative privileges on a testing ultrabook; that couldn’t and didn’t grant me any permission on other systems.

    A new but limited ultrabook

    On Friday, April 26, 2013, I got a new Windows 8 ultrabook at my workplace. It was officially meant for testing a Windows-based virtual assistant we were developing at the time, but the machine could do more: temporarily replace my official work laptop, which was becoming too sluggish. Replacement of the old laptop had been delayed for procedural reasons. I knew I could install my stuff on the ultrabook without disturbing the virtual assistant application, so it could perform both functions.

    The Monday after, I was heading to the Burlington office of my company to provide technical support for the people there. I wanted to bring the new ultrabook with me, so I needed to install a couple of programs on it before leaving. Unfortunately, I quickly noticed, Friday at the end of the day or during the weekend, I don’t remember, that I couldn’t install the JDK on the machine because I was not an administrator. I wasn’t sure I could get IT to grant me administrative privileges by Monday just before leaving, and I wanted to get some stuff installed before then.

    Feeling a bit like a cowboy, I wanted to hack my way around this issue. Not being an administrator on my corporate laptop is a concern for me. At my current workplace this is not an issue, but I have heard it is a problem in other companies, so having a last-resort way out seemed useful to me. I just figured out such a way, one that leaves almost no traces if everything goes well. Keep in mind this impacts only the hacked computer, nothing else on the network.

    Shutting down Windows 8 properly

    The main idea of my strategy was to boot the ultrabook into Linux, mount the Windows partition, and hack the registry to do something about the unknown administrator password. For this, Windows 8 has to be shut down properly. A new feature called hybrid startup leaves the shutdown unclean, preventing Linux from mounting the Windows partition read-write. Fortunately, this can be worked around by cleanly shutting down the PC. The simplest way is to start a command prompt (Windows key + R, then cmd) and type shutdown /s /t 0. Two years ago, I also found out I could hold the Shift key while clicking the Shutdown button, but I’m not sure this still works.

    Booting Linux

    Then I needed to boot into Linux. The simplest solution is to use the Offline NT Password & Registry Editor, but it was not compatible with UEFI at the time, and I wasn’t sure I would be able to perform a non-UEFI boot on this Dell XPS 13 ultrabook. Moreover, I can no longer find the download for the tool; it seems one now has to email the author to get a hidden link. I find this quite a bad practice, and when that happens, I tend to look elsewhere.

    I thus tried to boot Ubuntu, and I had to do it from a USB key because the XPS 13 has no CD/DVD drive. I don’t remember exactly how I created the Live USB key. I probably used the live USB creator tool built into Ubuntu, but other pages, such as this one, give clues about how to create it from Windows.
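    From an existing Linux machine, the key can also be written with dd; /dev/sdX below is a placeholder for the USB device, and getting it wrong will overwrite the wrong disk:

    ```shell
    # Identify the USB key first; dd will happily destroy whatever you point it at.
    lsblk

    # Write the ISO raw to the whole device (not to a partition like /dev/sdX1).
    sudo dd if=ubuntu-desktop-amd64.iso of=/dev/sdX bs=4M status=progress
    sync
    ```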

    I then had to modify the BIOS/UEFI settings of the ultrabook to alter the boot priority. If I remember correctly, I had to hit F2 while the XPS 13 boots, before Windows starts of course. I managed to get the ultrabook to boot the USB stick in UEFI mode, but that crashed after the boot. I thus had to enable legacy boot and then boot the USB key in MBR, non-UEFI mode.

    chntpw

    After I successfully booted into the Ubuntu Live USB, I started a terminal and entered sudo apt-get install chntpw. This installed the Offline NT Password & Registry Editor. I just tested this while writing this post on an Ubuntu 15.04 box, and it still works!

    After the tool was installed, I of course started it: sudo chntpw. I followed the instructions. I was offered the opportunity to reset the administrator password, but I didn’t like that option, because I would not be able to restore the ultrabook to its original state: my hack would leave a trace. I found a better option: activate the hidden Administrator account! After this was done, I rebooted into Windows and was able to log in as Administrator.
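    From what I recall, chntpw operates on the SAM hive of the mounted Windows partition; here is a sketch, with the device name as an assumption:

    ```shell
    # Mount the Windows system partition (the device name varies per machine).
    sudo mkdir -p /mnt/windows
    sudo mount /dev/sda2 /mnt/windows

    # Interactive mode; the menu includes options to clear a password or
    # enable/unlock the built-in Administrator account.
    sudo chntpw -i /mnt/windows/Windows/System32/config/SAM

    # Unmount cleanly before rebooting into Windows.
    sudo umount /mnt/windows
    ```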

    I don’t remember whether I absolutely had to restore the UEFI settings and disable legacy boot for Windows 8 to boot again, but I did it so my intervention would be as clean and traceless as possible. At worst, I would have gotten an error message when attempting to boot without the USB key and would have had to alter the boot priority and/or disable legacy boot: no harm done to Windows.

    One step further

    The problem was solved, but I wanted to go one step further: transfer the gained administrative privileges to my regular user account! For this, while logged in as the local Administrator, I had to open Control Panel, then Administrative Tools, then Local Users and Groups. Unfortunately and very shockingly, this option has been completely hidden away in Windows 10: you once again have to search on Google to figure out that you need to press Windows + R to open the Run dialog, type lusrmgr.msc, and click OK. I hope one day Microsoft will understand that this is a very bad and frustrating practice that will drive many power users, including me if I could, to migrate to Mac OS X.

    I then selected Groups, double-clicked on Administrators and clicked Add to add a member. The system offered me a dialog box to type the user name to add, but Windows was unable to find my user name of the form <company name>\<user name>.

    I don’t know how I thought of it, but I figured out that Windows would need to access my company’s Active Directory service to resolve user names to IDs. Since I was at home, I needed to establish a VPN connection. I thus installed the Cisco VPN client on the ultrabook (I would need it afterwards anyway), and was then able to add my user account to the local Administrators group. I don’t remember exactly how I got the VPN client: maybe I had a copy lying around on my main computer for obscure reasons, maybe I turned on my main corporate laptop to download it. I was also able to hook up to the VPN from Ubuntu, without a tool downloadable only from my company’s Intranet. Either way, I got the VPN and that worked.

    After that, I logged back in as my regular user and was able to install the JDK without any issue. I then went back into Local Users and Groups, selected Users, double-clicked on Administrator, and disabled the account. That closed the back door I had used to gain administrative privileges, without taking away my new rights.

    Will this always work?

    No. Unfortunately, I can imagine ways to prevent this trick from working. The easiest is to set a password on the BIOS settings: not being able to modify them means no way to alter the boot priority. With that enforced, the only workaround would be to remove the SSD from the machine, install it in another computer running Ubuntu, and run chntpw, making sure it works on the SSD, not on a potential main Windows install dual-booting on the Ubuntu box! Removing an SSD from a laptop or ultrabook is sometimes a risky operation that may require disassembly of the keyboard, memory modules, casing, etc. I’m not sure I would have attempted it.

    Of course, the latter workaround fails miserably if the disk is encrypted, e.g., with Symantec’s PGP Whole Disk Encryption. One possible workaround may be to take the SSD out again and install it in an Ubuntu box itself running Symantec’s PGP; if the encrypted drive’s password is known, it may be enough to decrypt the drive and mount it, allowing chntpw to work on it. It could also happen that the encryption key is derived from the user’s password and a hash of computer-specific information; in that case, it could be quite hard to work around the protection. One possibility, if the BIOS is not password-protected, may be to boot into a Live USB Ubuntu, install the encryption tool, and try to decrypt the drive on the computer itself.