Modelling Tool for Banking and PKI Smart Card Personalization



Modelling Tool for Banking and PKI Smart Card Personalization

Master Thesis

By

Sami Saadaoui

Advisor: Prof. Dr. Jacques Calmet

Institut für Algorithmen und Kognitive Systeme


Introduction

The personalization process

The personalization of smart cards is a very important process on the way to the production of the card and to its use: it is at this point that all the private data required by the customer is loaded onto the chip and personalized. This operation is known as "electrical personalization" and is the subject of this thesis.

There are, however, other processes that have to be carried out during personalization. This is particularly the case for the graphical information printed on the card and for the secret data that has to be sent to the customer.

The electrical personalization

The most critical stage, however, is the electrical personalization, which fixes the content and the behaviour of the smart card. Moreover, this process ends with the blocking of the various administrative commands (APDU commands) that load certain operating-system data onto the chip during personalization.

In the case of an internal chip problem, the card must therefore, in most cases, be destroyed, because the error can no longer be corrected. Such problems absolutely have to be avoided, especially when


particularly the case for secret information such as cryptographic keys and PINs, which must always be encrypted. The encryption methods rely in this case on the key management system and must imperatively be taken into account during the personalization of the card.

In practice, the electrical personalization is carried out by means of specific APDU commands, which make it possible to communicate with the card and to load the required data.

These commands are generated by interpreting a personalization script that contains all the necessary operations.

The operations represent all the security, operating-system and online commands, and are written according to the syntax of the scripting language.

In order to create or load the chip content, all the required data must therefore be modelled first. The resulting model then serves as the starting point for the generation of the script that is used in production to produce the APDU commands sent to the card.
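To make the command level concrete, here is a minimal, self-contained sketch of how a script interpreter could assemble such APDU byte sequences. The SELECT header values follow ISO 7816-4; everything else (the class name, the example AID) is illustrative and not taken from the actual tool.

```java
// Illustrative sketch of how a script interpreter could assemble the
// low-level APDU commands sent to the card during personalization.
public class Apdu {
    // Builds a case-3 APDU: CLA INS P1 P2 Lc data
    public static byte[] build(int cla, int ins, int p1, int p2, byte[] data) {
        byte[] apdu = new byte[5 + data.length];
        apdu[0] = (byte) cla;
        apdu[1] = (byte) ins;
        apdu[2] = (byte) p1;
        apdu[3] = (byte) p2;
        apdu[4] = (byte) data.length;        // Lc: length of the data field
        System.arraycopy(data, 0, apdu, 5, data.length);
        return apdu;
    }

    // SELECT by application identifier (ISO 7816-4: INS 0xA4, P1 0x04)
    public static byte[] select(byte[] aid) {
        return build(0x00, 0xA4, 0x04, 0x00, aid);
    }

    public static String toHex(byte[] b) {
        StringBuilder sb = new StringBuilder();
        for (byte x : b) sb.append(String.format("%02X", x));
        return sb.toString();
    }

    public static void main(String[] args) {
        byte[] aid = {(byte) 0xA0, 0x00, 0x00, 0x01, 0x51}; // example AID
        System.out.println(toHex(select(aid)));  // 00A4040005A000000151
    }
}
```

In the real process, a whole sequence of such commands is derived from the personalization script rather than written by hand.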

Modelling methods

Certain modelling methods are already in use, but each of them is based on only one aspect of the card.

The "business" concept models the card according to its future use. This is notably the case for banking cards, which must follow a given standard (such as EMV, Europay Mastercard Visa). The only cards that can be modelled in this way are those that correspond to these types.

Despite this restriction, the method provides a fast personalization process for certain card types.

The "commands" concept models the personalization logic by combining all the operating-system commands in order to produce the personalization script.

This method requires a thorough knowledge of the operating-system commands of the card and can only be applied to card types that share a similar operating system. It is therefore used in particular for the "open" cards based on the Open Platform standard, a card type mainly used for PKI (Public Key Infrastructure) cards.


Finally, the "entities" concept models the file system inside the card (according to the ISO 7816-4 norm). Consequently, it is mainly suited to the "closed" cards, i.e. cards whose operating system is one hundred per cent proprietary.

This method is independent of the "business" of the card and thus complements the first concept. It is, however, not suitable for Open Platform cards, which already hide the file system and expose a different kind of entities.

The modelling tool project

The main objective of the modelling tool is to provide an automated process for the personalization of banking and PKI smart cards. This process is meant to cover all personalization steps, from the definition of the customer requirements to the generation of the personalization software.

The tool will also offer further functionality, such as the generation of test cards, and is meant to be fully extensible.

Indeed, the goal is not to introduce a card- or platform-specific tool, but a generic method that can not only be extended to support other cards or "businesses", but also enrich the system with new functionality.

The primary objective of the tool, however, is to generate the scripts that are used in production to personalize the smart cards, so that a manual creation of these components can be avoided. The technical challenge is to support all card types, so that the implementation of a new system can be avoided whenever new requirements arise (a new operating system, ...).

Thus, the goal of this work is to propose a model that takes all aspects of the electrical personalization into account and can in this way describe any card type (in particular PKI and banking cards). This model must then be interpreted in order to generate the personalization script automatically. This generation, which depends on the


The proposed solution

The proposed model can be seen as a hybrid solution that combines and improves certain aspects of the existing concepts. The idea is thus to model both the entities of the card (without tying them to one particular aspect of the card) and the various personalization commands that are required.

The personalization model

The proposed entities model consists of three essential parts: structure, security and source.

The structure part describes the internal hierarchy of the entities. For the closed cards it therefore represents the file system. It also describes the internal state of the chip by means of an "Action" field, which indicates the operation that has to be performed on the card at personalization time.

This concept allows fast navigation within the modelled internal structure of the chip, and easy identification of the entity on which a particular operation has to be performed.

The state model also makes it possible to carry out the modelling at any step of the card life cycle, so the process can be run even after the first personalization. The best example is the replacement of an applet on an Open Platform card. With the state model, the state of the old applet is defined by setting its "Action" field to "delete"; the action for the new entity is then "create". During the interpretation of the model, the two commands (for deleting the old applet and creating the new one) are then generated correctly.
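The applet-replacement scenario above can be sketched as follows; the entity representation, the `Action` values and the emitted command names are hypothetical and only illustrate the interpretation principle.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the state-model idea: every entity carries an
// "Action" field, and the interpreter emits the matching commands.
// Names ("Entity", "DELETE", ...) are illustrative, not the thesis' actual API.
public class StateModel {
    enum Action { NONE, CREATE, DELETE }

    record Entity(String name, Action action) {}

    // Walks the model and collects one high-level command per pending action.
    public static List<String> interpret(List<Entity> model) {
        List<String> commands = new ArrayList<>();
        for (Entity e : model) {
            switch (e.action()) {
                case DELETE -> commands.add("DELETE " + e.name());
                case CREATE -> commands.add("INSTALL " + e.name());
                case NONE   -> { /* entity already in its final state */ }
            }
        }
        return commands;
    }

    public static void main(String[] args) {
        // Replacing an applet: the old one is marked "delete", the new "create".
        List<Entity> model = List.of(
                new Entity("oldApplet", Action.DELETE),
                new Entity("newApplet", Action.CREATE));
        System.out.println(interpret(model)); // [DELETE oldApplet, INSTALL newApplet]
    }
}
```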

The security part (the "Security Domain" part) represents the various cryptographic keys used for the authentication with a given entity. Indeed, certain personalization operations require an authentication between the card (more precisely the selected entity) and the personalization machine.


This part is then used during the generation of the personalization script to produce all the authentication commands.

The number and the various parameters of the cryptographic keys are not restricted, because they depend on the operating system of the card. Three different keys, for example, are required when defining a "Security Domain" (according to the Open Platform standard); this is not necessarily the case for other cards.

Finally, the source part describes all the parameters needed for the generation of the commands: the way certain data items are computed or encrypted, for instance, the reference to the operating system of the card and to the "business", as well as all information specific to a given entity.

This part therefore contains the most important parameter information and, above all, the "source" of a given entity. This concept was introduced so that all data and cryptographic operations can be modelled as well.

In order to understand the importance of this modelling, a few aspects of the key and data management systems have to be introduced.

A cryptographic key, for example, is never present in clear text (unencrypted) inside the network. It is generally encrypted under a master key of the security machine (e.g. another smart card). Moreover, in many cases the keys are stored in a secure database and have to be retrieved with a specific command of the key management system.

In some cases, simply retrieving the key is not the only operation that has to be performed on this entity. The key may have to be derived with specific data (required by the customer, present inside the card such as the serial number, ...) or even with another transport key (needed during the secure personalization process).
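As an illustration of such a derivation, the following sketch derives a card-specific key by encrypting diversification data (here a padded serial number) under a master key. The concrete scheme used in production is proprietary; this only shows the principle, with AES standing in for whatever algorithm the key management system actually uses.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

// Illustrative sketch of key derivation: a card-specific key is obtained by
// encrypting diversification data (here the chip serial number) under a
// master key. The exact production scheme is proprietary.
public class KeyDerivation {
    public static byte[] deriveCardKey(byte[] masterKey, byte[] serial) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(masterKey, "AES"));
        return c.doFinal(serial); // serial padded to one AES block (16 bytes)
    }

    public static void main(String[] args) throws Exception {
        byte[] master = new byte[16];       // demo master key (all zeros)
        byte[] serial = new byte[16];       // demo serial, padded to 16 bytes
        serial[15] = 0x42;
        byte[] cardKey = deriveCardKey(master, serial);
        System.out.println(cardKey.length); // 16: one derived AES key
    }
}
```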

All these operations are independent of the card type, but they play a major role during the personalization of the card. That is why this modelling is important as well, so that all aspects of the electrical personalization can be taken into account.

The grammar introduced above is defined using an XML syntax, which allows better navigation within the model. The XML parser also hides the compilation and data-extraction aspects. The XML concept further allows a quick link between these


In the proposed solution, the commands model constitutes an intermediate stage between the entities model defined above and the generated script. The existing concept is also improved so that it integrates not only the operating-system commands but all the operations used during personalization (cryptographic commands, ...).

These commands are closely linked to the various "sources" defined in the first model. All the operations needed to retrieve a key, for example, are available and can be modelled as well.

An encryption operation can thus be used independently of the security machine; the corresponding command is then translated correctly during the final script generation.

In this model, every command is seen as a "black box" with a list of inputs and outputs. The variables used in the personalization script are thereby modelled as well. Finally, every command has a specific type (operating system, online service, ...) which, during the interpretation of the commands model, yields the corresponding script commands.
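A minimal sketch of this black-box view, assuming hypothetical names: each command carries a type plus input and output variables, and a trivial translation step renders it as a script statement.

```java
import java.util.List;

// Sketch of the "black box" view of a high-level command: a name, a type
// (operating system, cryptographic, online service, ...), inputs and outputs.
// The interpreter later translates each command according to its type.
// All names here are illustrative.
public class HighLevelCommand {
    enum Type { OS, CRYPTO, ONLINE }

    final String name;
    final Type type;
    final List<String> inputs;   // script variables consumed by the command
    final List<String> outputs;  // script variables produced by the command

    HighLevelCommand(String name, Type type, List<String> in, List<String> out) {
        this.name = name; this.type = type; this.inputs = in; this.outputs = out;
    }

    // A trivial "translation" step: render the command as a script call.
    String toScript() {
        return String.join(", ", outputs) + " = " + name
                + "(" + String.join(", ", inputs) + ");";
    }

    public static void main(String[] args) {
        HighLevelCommand retrieve = new HighLevelCommand(
                "retrieveKey", Type.CRYPTO, List.of("keyRef"), List.of("kmc"));
        System.out.println(retrieve.toScript()); // kmc = retrieveKey(keyRef);
    }
}
```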

The automatic generation of the personalization script therefore takes the entities model as its starting point and produces a sequence of high-level commands (according to the commands model).

This sequence certainly depends on the format of the script that is used, but it can be translated into any language that obeys this format.

The second part of the thesis therefore explains this automatic generation for a high-level language (such as JavaScript), in which variables and functions can be used in the generated script. This is not the case for APDU-like languages, which require a management of the memory space and of the buffers in the personalization machine.

The automatic script generation

The automatic generation system is mainly based on the part of the model that describes the state of the card. For every entity whose "Action" field specifies an operation, the management class corresponding to the operating system of the entity is loaded.


This module then takes over the management of the authentication system as well as of the various personalization commands needed to execute the action.

The commands and all the required variables are thus added to predefined data structures. In the end, this yields a logical sequence of the commands that will be interpreted in the script, together with all the variables that have to be used.

The main difference between high-level (such as JavaScript) and low-level script formats lies in the management of the variables used. The concept can therefore be extended by maintaining not only a list of variables but also a model of the various buffers of the personalization machine.

This model covers the memory space and the persistence of the data. Indeed, many pieces of information are used in several APDU commands; they may only be deleted once they are no longer needed. This aspect represents an additional problem and thus shows the advantage of JavaScript.

In our case the generation system is therefore very simple, because the only modelling needed covers the command sequence and the variables used.

All these parameters are managed by the main class, which forwards the commands and variables to the corresponding data structures.

The sequence logic, however, depends on the operating system of the card or on the source of a given entity. That is why two different delegation concepts have been introduced, providing an operating-system management and a data management.

The operating-system-specific manager is thus loaded for every entity of the defined entities model whose "Action" is not "NONE". This module is then responsible for the generation logic and for the meaning of the entity parameters.

Moreover, every operating-system module also manages the authentication unit required for the personalization. Three parameters are necessary for this security model: the currently selected entity, the authentication state and the "Security Domain". The module can resolve the selection dependencies by comparing the currently selected entity with the new one. After the selection unit, a check on the authentication state and on the


Since each entity holds a reference to its "Security Domain" in the entities model, this comparison is straightforward. Once all these steps have been carried out, the operating-system-specific module updates the security model. (Note: the authentication process is an operating-system requirement; depending on the "Action" field of the entity, it may or may not be carried out.)

Finally, and depending on the "Action" field of the entity, the module is also responsible for managing the command dependencies (all the commands needed to carry out the action).

The data management classes correspond to the different kinds of commands presented in the fourth part of the thesis.

Every data manager is responsible for resolving the source dependencies of a specific entity.

In the case of a cryptographic key, the data management class is loaded according to the source of the entity in the entities model (fixed value, secure database, ...). This module then generates the list of commands needed to retrieve the key and adds them to the command sequence.

Finally, the specific data manager creates a temporary variable that will be the final container for the key, and forwards it to the operating-system module. Data dependencies are thus completely hidden from the operating-system class; the data variable used in the script is then simply retrieved.

Implementation

Finally, a specific example of a management class (representing an Open Platform card) has been implemented, in order to demonstrate the model and the automatic generation of the personalization script on a concrete example.

The implementation was done in Java, since many open-source projects can be reused. For the various XML operations, JDOM (an interface for working with XML documents) was used. The most important classes implemented are those that manage the security concepts of the Open Platform cards. They are thus responsible for generating the various commands relevant to the "Secure Channel" definition, the signature computation (the "MACing procedure") and the encryption. These concepts are defined by the


Open Platform standard and determine the way data is exchanged between the card and the personalization machine. The standard thus defines three different security levels for the data exchange:

- "CLEAR" if the information is not encrypted at all.
- "MAC" if a data signature is required.
- "ENCMAC" for the encryption of the data and the signature computation.

The security level is fixed during the definition of a "Secure Channel" process between the card and the machine. This level is then used for all commands sent to the card as long as the secure channel is still valid.
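The following sketch illustrates how such a security level, fixed once when the channel is opened, could be applied to every subsequent command. The MAC and encryption steps are stubbed as string markers; real Open Platform cards use 3DES-based session keys for these operations.

```java
// Sketch of the Open Platform secure-channel security levels: once the
// channel is opened with a level, every command is wrapped accordingly.
// The MAC/encryption steps are stubbed; this only shows the control flow.
public class SecureChannel {
    enum Level { CLEAR, MAC, ENCMAC }

    private final Level level;

    SecureChannel(Level level) { this.level = level; }

    // Wraps one command according to the channel's security level.
    String wrap(String command) {
        switch (level) {
            case CLEAR:  return command;                       // sent as-is
            case MAC:    return command + " + MAC";            // signature appended
            case ENCMAC: return "ENC(" + command + ") + MAC";  // encrypted and signed
            default:     throw new IllegalStateException();
        }
    }

    public static void main(String[] args) {
        SecureChannel ch = new SecureChannel(Level.ENCMAC);
        // The same level applies to every command while the channel is open.
        System.out.println(ch.wrap("STORE DATA"));  // ENC(STORE DATA) + MAC
        System.out.println(ch.wrap("INSTALL"));     // ENC(INSTALL) + MAC
    }
}
```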

The implementation was thus made so that the secure channel commands can be generated. The various cryptographic operations (encryption and signature computation) can then be applied to any command.

The example shows how a specific Open Platform entity (a "Security Domain") can be created. The various commands represent all the authentication and operating-system operations needed for the creation of the entity.

Summary and outlook

This work has thus shown that it is possible to use a generic model that represents all aspects related to the electrical personalization of a smart card.

Thanks to its format (XML), this model is also easy to interpret, which allows the introduction of an automatic generation process for the


are used during the personalization process. Indeed, every electrical personalization is followed by a mandatory control phase.

During this process, a script is also generated and used to verify the result of the commands that were sent to the card during personalization.

The next step in the project will therefore be the automatic generation of the control script; the complete software package (personalization and control) would then be generated automatically. To reach this goal, the same generation logic that was defined for the personalization could be reused.

Indeed, while the personalization script is executed, the required data is written onto the chip. For the control, the process is reversed: the commands sent to the card are used to read certain data back, and this information is then compared with the required data. In both cases (write/read) the sequence logic is almost the same, so the automatic personalization process defined here could be reused for the control.

The defined model could thus be the starting point for both the personalization and the control, which demonstrates the flexibility of the new solution.


Modelling Tool for Banking and PKI

Smart Card Personalization

Master Thesis

By

Sami Saadaoui

Advisor: Prof. Dr. Jacques Calmet

Institut für Algorithmen und Kognitive Systeme


Declaration:

I hereby declare this thesis to be the result of my own work. All sources and literature used are correctly cited.


Acknowledgements

I would like first of all to thank Prof. Dr. Jacques Calmet, Mr Jean Louis Roch and Mr William Delambre, who gave me the opportunity to do my master thesis on a very interesting subject, proposed by the University of Karlsruhe, the ENSIMAG and the Axalto company.

This work would not have been possible without the help of the "Personalization" team members at Axalto, and especially Frederic Gaubert, who was always there to answer my questions and to give me the advice I needed to pursue my study.

I also dedicate this work to my family and my friends, who encouraged me during these six months of stress and intensive work, and who made all my achievements possible.


Table of Contents

1 Introduction
2 State of the art
2.1 The personalization Process
2.2 The electrical personalization
2.2.1 Personalization of native cards
2.2.2 Personalization of open platform cards
2.3 Constraints related to Banking and PKI cards
2.4 Personalization modelling solutions
2.4.1 Commands Modelling
2.4.2 Business Modelling
2.4.3 Entities Modelling
2.4.4 Global platform
3 The Modelling Tool Project
3.1 Project objectives
3.2 System overview
3.3 Study objectives
4 Model Proposed
4.1 Entities Modelling
4.1.1 Description of the Application Profile
4.1.1.1 The "Data-Structure" part
4.1.1.2 The "Data-Source" part
4.1.1.3 The "Security domains" part
4.1.1.4 Link between the different application profile parts
4.1.2 Advantages of the application profile definition
4.2 High-level Commands Modelling
4.2.1 Commands types
4.2.2 High-level command concept
4.3 Summary
5 Automatic process
5.1 Interpretation of the application profile
5.2 Possible scripting language formats
5.2.1 Low Level scripting language (Assembler-Like)
5.2.2 High-level scripting language (JavaScript)
5.3 Automatic script generation
5.3.1 The Macros Script
5.3.1.1 General concept


6 Implementation
6.1 Choice of the programming language
6.2 OS aspects and design concept
6.2.1 Packages overview
6.2.2 Class Overview
6.3 Example: creation of a security domain
7 Conclusion and further work
8 Bibliography
9 Glossary
10 Annexes
10.1 The Smart card technology
10.1.1 Technological Evolution of smart cards
10.1.2 New Smart card standards
10.1.2.1 Open Platform


List of figures

Figure 1: Electrical personalization of the card
Figure 2: Industrial process for electrical card's personalization
Figure 3: Modelling with a commands approach
Figure 4: Modelling with a business approach
Figure 5: Modelling with an entities approach
Figure 6: Draft of the Suite subprojects
Figure 7: Framework approach for the conception of the suite
Figure 8: Overview on the new personalization model
Figure 9: Sample file system
Figure 10: Communication between the different application profile parts
Figure 11: Derivation data used during the secure channel initialisation
Figure 12: High-level commands concept
Figure 13: Application profile overview
Figure 14: Class Loading process and plug in concept
Figure 15: Script generation mechanism
Figure 16: Macros Script components
Figure 17: Macros Script Manager delegation schema
Figure 18: Operating system delegation mechanism
Figure 19: Data delegation mechanism
Figure 20: Packages overview
Figure 21: Security Process Classes
Figure 22: MACing Procedure
Figure 23: State of the macros script
Figure 24: Architectural overview of a Microprocessor smart card
Figure 25: Open platform card architecture


1 Introduction

Nowadays, smart cards are known above all for their application fields and their technological aspects. Many people are, for instance, interested in the new cryptographic capabilities or in the memory extension inside the chip. All these aspects are research oriented and interest the computer science community.

Another aspect however, still remains industry oriented. This aspect concerns the production cycle of the card.

Indeed, the smart card has to be issued and then personalized according to the customer requirements. The steps involved in this process are very "owner specific" and represent the "business" of each smart card company.

The “Personalization” is perhaps one of the most critical industrial processes during this card production cycle.

During the personalization, all the data needed (applications, customer data…) is loaded inside the card. This step is called “electrical personalization” of the card. The other steps involved concern the graphical aspects of the card body, the magnetic stripe encoding and the mailing procedure [AxPerso].

All these steps are very critical since they result in the production of millions of cards and that’s why many constraints have to be taken into consideration during the personalization of the card.

First of all, the data loaded on the chip must be available within a secure environment that guarantees its confidentiality and integrity. This is typically the case for banking cards, because they contain very sensitive data. During the personalization process, additional security mechanisms are necessary to encrypt this data, in order to protect its transport between the host and the smart card [AxPerso].


Moreover, the different processes involved in the personalization of the card chip must be one hundred per cent reliable. This aspect is very important especially during the electrical personalization of the card. Indeed, once the card is personalized, the administrative commands of the operating system are blocked, which makes any further modification of the chip content impossible [AxPerso]. In other words, if there is an error during the electrical personalization of the card, the chip is not usable any more and has to be destroyed. In the case of big contracts (millions of cards), this would represent a financial disaster for the company.

Thus, the electrical personalization is considered one of the most important and critical aspects of the card issuance. All the steps involved in this process have to be correctly controlled and the error rate has to be minimized. These goals cannot be reached without an automatic process to generate the software packages that are used to personalize and control the cards.

That’s why the main purpose of the study is to understand all the technical aspects related to the electrical personalization and to propose a model describing the information needed for this process. The model has to be as generic as possible in order to support PKI, banking or even other types of cards.

This personalization model should then be the starting point for the automatic generation of the different components used to personalize the card.

The aim of this document is to present a possible automatic process for the generation of the personalization script, which contains the different commands, and is used to personalize the card. This process is based on the model defined in the first part of the thesis.

Thus, the following study will include a first part (section two) presenting the electrical personalization and the existing modelling solutions. The aim of the “state of the art” is the introduction of the main concepts involved in the modelling of the personalization process.

For a better understanding of these concepts, an overview on the smart card technology is provided as annex.

The third section gives an overview of the context of the work. The aim is to present the main objectives of the modelling tool project and to identify the points that have to be studied in detail.

Indeed, the complexity of the problem requires explaining the technological aspects of the modelling tool and identifying its challenging facets. The kernel of the tool is then the starting point for the detailed study.

The next three sections present the solution proposed: the model describing the personalization data, the automatic process allowing the interpretation of this model and the script generation, and an implementation example.


All these parts are a proof of concept for the generic modelling and the automatic personalization script generation.

Finally, it is important to note that this study combines the identification of a new personalization-modelling concept with the specification of a software component, involved in a strategic and industrial project.


2 State of the art

2.1 The personalization Process

The personalization is a process in which a smart card receives all the general and individual information necessary to be issued in the field. This process involves many steps [AxPerso]:

Chip Programming

In this step the internal architecture of the card is created (the file architecture in the case of native cards), and the needed applications or applets (for Java cards) are loaded and personalized.

The security parameters are defined (PIN, cryptographic keys for encryption, authentication…).

Moreover, all data related to the cardholder is created inside the chip (address, social security number, account number, phone number…).

Graphical Personalization

Each card has specific information printed on its plastic body. Thus, this step includes the embossing mechanism and the printing procedure (name, subscriber NO, photo…), which can be thermal, with laser or Ink-jet.

Magnetic Stripe Encoding

For cards with a magnetic stripe, an encoding procedure is needed in order to provide this stripe with the information required by the customer.

Fulfilment

This final step provides the cardholder with his new card: the card is attached to a mailing letter and sent to its new holder. The cardholder also receives the PIN (or the other security mechanisms he needs to use his card).


• Data Processing/Preparation:

In this step the data sent by the customer (information about the batch, encryption key references…) is checked and processed. As a result of this phase, we obtain files requested by the customer or used as internal process reports, and “image” files containing the personalization data and used in production.

• Loading software:

In this phase the generic and individual data needed for the personalization is loaded on the production equipment (the personalization machine).

• Quality control:

The quality concerns not only the internal state of the chip but also the graphical personalization. Thus, this step includes an EEPROM check and a check of the card body content.

2.2 The electrical personalization

The electrical personalization or chip programming will be the main subject of the study. Indeed, it represents the main technical challenge of the whole personalization process since all the internal chip data is created during this phase.

Figure 1: Electrical personalization of the card

The electrical personalization must be seen as an "industrial" process, where specific software components are developed and loaded on a personalization machine in order to customize a special card type (according to the requirements of the customer).

The software components are usually composed of two main parts: the customer-oriented programs (special data processing), and the scripts that are interpreted by the personalization machine readers in order to generate the personalization commands sent to the card.

These personalization commands are low-level APDU commands allowing the communication with the operating system of the card.
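As an illustration, a command APDU as defined by ISO 7816-4 consists of a four-byte header (CLA, INS, P1, P2), optionally followed by a length byte and a data field. The sketch below builds such a command; the SELECT instruction bytes are standard ISO 7816-4 values, while the helper name and the sample AID are our own illustrative choices.

```python
def build_apdu(cla, ins, p1, p2, data=b""):
    """Build an ISO 7816-4 command APDU: 4-byte header, then Lc + data if present."""
    apdu = bytes([cla, ins, p1, p2])
    if data:
        apdu += bytes([len(data)]) + data
    return apdu

# SELECT (INS = 0xA4) by DF name (P1 = 0x04), with a sample 5-byte AID
select = build_apdu(0x00, 0xA4, 0x04, 0x00, bytes.fromhex("A000000003"))
print(select.hex().upper())  # 00A4040005A000000003
```

A personalization script is, in essence, a long sequence of such byte strings sent to the card reader.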

[Figure: the personalization data is loaded on the personalization machine, which sends the personalization commands to the chip.]

Figure 2: Industrial process for the electrical personalization of the card

The customer-oriented programs cannot be “modelled”, since every customer has his own requirements and file formats. The scripts, however, could be generated automatically, since they depend on the card type and on the interpreter used for the APDU commands generation.

Moreover, these scripts also include the data recovery process and are highly proprietary, since the card management system differs from one company to another.

Thus, this personalization procedure may not be the same in other smart card companies, and it is difficult to have precise details on the methods used by them. Indeed, the personalization is the first step of the industrialization, and the “core business” of the companies relies on the performance and the speed of this process.


2.2.1 Personalization of native cards

The “native” cards are microprocessor cards with a proprietary operating system. That means that there is no cross-industrial possibility of personalizing such cards. Every provider knows the OS commands needed for the communication with the card, and the personalization is, in this case, owner specific.

Nevertheless, these cards have one similarity: the ISO 7816-4 norm, which defines the internal file system of the smart card [ISO78].

Thus, the card content can be modelled using the file entities defined by the ISO norm (MF, DF, EF). This model, however, must be interpreted differently according to the operating system of the card, since the APDU commands for the creation of the files are totally different.
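The point can be sketched as follows: one and the same file model is fed to two OS-specific generators that emit entirely different byte sequences. The instruction bytes below are invented placeholders, not real operating system command sets.

```python
# One ISO 7816-4 file model, two hypothetical OS-specific interpreters.
model = {"type": "EF", "id": 0x0301, "size": 64}

def create_file_os_a(entity):
    # Pretend OS "A" uses INS 0xE0 and passes the file ID in the data field
    fid = entity["id"].to_bytes(2, "big")
    return bytes([0x00, 0xE0, 0x00, 0x00, len(fid)]) + fid

def create_file_os_b(entity):
    # Pretend OS "B" uses INS 0xC4 and passes the file ID in P1/P2 instead
    return bytes([0x80, 0xC4, entity["id"] >> 8, entity["id"] & 0xFF])

print(create_file_os_a(model).hex())  # 00e00000020301
print(create_file_os_b(model).hex())  # 80c40301
```

The model stays generic; only the interpreter behind it is OS-specific.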

Moreover, the whole security system is, in most cases, customer specific. This aspect does not facilitate the creation of a generic approach for the personalization of native cards.

2.2.2 Personalization of open platform cards

OP cards have been introduced in the smart cards market in order to provide a common solution for applications development. Thus, all OP cards have a set of common concepts (operating system, security…) that allow the interoperability between the different smart card technology actors [OPCS]. The main advantage of the OP cards is that the file system is hidden by other high-level entities, performing a set of administrative commands.

In terms of personalization, this implies customizing these open platform entities while respecting the set of security concepts defined by the norm.

Thus, in this case, the modelling is easier, since the cards have more similarities and the personalization can be standardized [OPCPG]. This aspect, however, does not take into consideration the proprietary commands and concepts that may vary from one card to another.


2.3 Constraints related to Banking and PKI cards

o Banking cards

The financial field is nowadays one of the most important smart card markets. Every bank customer has at least a magnetic stripe card, and microprocessor cards are starting to be widely used [AxPerso].

The first cards, however, were native ones, since the banking customers preferred owner-specific operating systems to common platform cards. Moreover, before the introduction of the open platform concept by VISA [OPCS], one of the security principles of banking cards was precisely to “hide to protect”.

With the open platform, the security was improved, but the “open” concept still frightens those who prefer closed and secret operating systems. In addition, OP cards are more expensive due to their technological advances (memory, security…), and that is why native cards are still widely used in the banking sector [AxPerso].

Thus, one of the main constraints related to banking cards would be to take this aspect into consideration, and to adapt the modelling to OP and native cards.

Another aspect concerns the “business” part of a specific market segment. For the banking field, this constraint is for instance represented by EMV (Europay, MasterCard and Visa).

EMV provides a global specification for cards, terminals and acquiring systems in order to ensure correct operation and interoperability during a credit or debit transaction [EMV].

EMV defines a set of on-card data elements with fixed characteristics in order to provide this interoperability [EMV]. All these characteristics represent a personalization constraint since they have to be created on the card, with the right parameters.
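For illustration, EMV data elements are BER-TLV encoded. The tags below (5A, 5F24, 5F20) are well-known EMV tags, while the values and the helper function are dummy examples of ours.

```python
def tlv(tag: bytes, value: bytes) -> bytes:
    """Encode one BER-TLV data object (short-form length, value < 128 bytes)."""
    assert len(value) < 0x80
    return tag + bytes([len(value)]) + value

# A few well-known EMV tags, filled with dummy values:
pan    = tlv(b"\x5a",     bytes.fromhex("5413330000000000"))  # Application PAN
expiry = tlv(b"\x5f\x24", bytes.fromhex("251231"))            # Expiration date (YYMMDD)
name   = tlv(b"\x5f\x20", b"TEST CARDHOLDER")                 # Cardholder name
record = pan + expiry + name
print(record.hex().upper())
```

During personalization, each of these objects has to be created on the card with exactly the tag, length and format fixed by the standard.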

Thus, the two main constraints of banking cards, which have to be taken into consideration during the modelling process, are: the card type (native or OP) and the business supported by the smart card.


o PKI Cards

PKI cards do not have the same constraints as banking ones since the application field is completely different.

Indeed, the access sector does not have a special norm that defines the different data elements on a PKI card, i.e. there is no common set of specifications (as in the case of EMV) defining common rules for this type of card.

The first PKI cards (before the introduction of the OP concept) were also owner specific. Each smart card operating system had its own commands for the creation and management of the different PKI entities (certificates, public and private keys…) [PKI].

Nowadays, the PKI cards that are commonly used are open platform ones [AxPerso]. Indeed, since the OP concept permits the development of applications that can be loaded on the card, many PKI application providers prefer to develop their access applet and to load it on an OP card.

Thus, PKI cards do not follow a specific business that influences the personalization process, and the dominant card type in this case is the open platform one.


2.4 Personalization modelling solutions

This section introduces some existing modelling solutions used to generate the personalization script. Each solution focuses on one specific card aspect (card type or business) and is thus applicable only in that specific field.

Despite this constraint, the modelling solutions are considered to be a big step towards the automatic personalization process. Indeed, the first personalization approaches were customer specific i.e. according to the requirements (card operating system, business…), the whole personalization package was developed in order to create one specific chip content.

The three main modelling solutions are the “commands modelling” used in the case of a common set of OS commands (like in the open platform case), the “business modelling” used to create chip contents according to a specific business (like EMV for banking cards), and the “entities modelling” describing the card content according to the ISO 7816-4 norm.

The final solution is a new personalization concept introduced by Global Platform [GP]; it can be seen as a kind of modelling approach that facilitates the customization of open platform cards.

2.4.1 Commands Modelling

[Figure: a list of high-level commands, with parameters set by the personalizer, is passed to an interpreter that “pseudo”-automatically generates the script with low-level commands.]

Figure 3: Modelling with a commands approach


This approach to script generation is based on the modelling of the different commands required for the personalization of the card. These commands can be either operating system or security device (the device used for the cryptographic operations) oriented [AxPerso].

The modelling approach then consists in regrouping low-level operations into high-level macros.

Example: (based on OP cards)

The secure channel established between the card and the server needs two APDU card commands (INITIALIZE UPDATE and EXTERNAL AUTHENTICATE) [OPCS]. These two operations are regrouped, with all needed cryptographic operations, in a high-level macro used for the definition of the secure channel process.

The same thing is done for other complex operations like the loading or the installation of an application on an open platform card. Regarding the simple commands (like SELECT or GET STATUS) [OPCS], the model is more unitary and the high-level macro defined is the low-level command itself.

Thus, this macros-oriented approach defines an intermediary level between the person who wants to personalize the card on one hand, and the low-level script commands on the other hand.

To reach the low level script, a simple interpreter is provided in order to “translate” the macro with all its parameters.
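A minimal sketch of this macro-to-script translation might look as follows. The macro names, the script syntax and the parameter values are illustrative assumptions, not an actual product format; only the two underlying OP commands (INITIALIZE UPDATE, EXTERNAL AUTHENTICATE) come from the text above.

```python
# Hypothetical macro interpreter: each high-level macro expands into
# one or more low-level script lines.
MACROS = {
    # Secure channel = INITIALIZE UPDATE + EXTERNAL AUTHENTICATE [OPCS]
    "SECURE_CHANNEL": lambda p: [
        "SEND 8050000008 " + p["host_challenge"],
        "SEND 8482000010 " + p["host_cryptogram"],
    ],
    # Simple commands map one-to-one to their low-level form
    "SELECT": lambda p: ["SEND 00A4040000"],
}

def expand(macro, params=None):
    """Translate one high-level macro into its low-level script lines."""
    return MACROS[macro](params or {})

script = expand("SELECT") + expand("SECURE_CHANNEL", {
    "host_challenge": "0011223344556677",
    "host_cryptogram": "00112233445566778899AABBCCDDEEFF",
})
for line in script:
    print(line)
```

The user only manipulates macro names and parameters; the low-level script stays hidden behind the interpreter.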

Advantages of the approach

The most important advantage of this approach is that the low-level script remains hidden to the user.

Indeed, all the parameters needed to perform a specific operation are entered using predefined wizards. These parameters are then involved in the low-level script generation.

This modelling is also easy to use for someone who knows the “logic” of the card. That means that the concept offers a professional approach for the configuration and the generation of the personalization script.

Finally, the commands approach offers a flexible way of manipulating the personalization script by changing or adding some operations according to some new requirements.


Drawbacks of the approach

Despite the flexibility of the commands approach, it also has a major drawback. Indeed, the concept is applicable only to a card type with a common set of operating system commands. This is typically the case for open platform cards. Thus, the method is mainly used for OP cards, because their security and operating system constraints can be easily modelled.

For native cards, for instance, the concept is not realistic, since the APDU commands are totally different: it would imply the development of a new tool for each new operating system.

Finally, the approach requires a high level of OS knowledge, which may be a blocking point for someone who wants a customer-oriented personalization. This is typically the case for technical support staff and developers who do not master all the operating system aspects.

2.4.2 Business Modelling

[Figure: a list of business entities, entered by the personalizer, is passed to an interpreter that “pseudo”-automatically generates the script with low-level commands.]

Figure 4: Modelling with a business approach

The business modelling is an approach that is especially used with banking cards. This concept is for instance based on the EMV standard, and used in order to create and personalize the EMV specific entities on the card. This includes the definition of the data elements with all their parameters according to the standard [AxPerso].


Advantages of the approach

The business modelling is a useful approach for card personalization according to a specific norm. In the case of EMV, this offers the best solution for the creation of the data elements defined by the standard. Moreover, the concept is easy to use for someone who knows the standard without having advanced knowledge of the card operating system. This aspect creates an abstraction level between the business and the OS constraints.

Thus, the business modelling represents a fast procedure for the personalization of a specific card type, which should be adapted to this specific business.

Drawbacks of the approach

The most important drawback of the business modelling concerns the extensibility of the concept. Indeed, since the design is made for a specific business, any tool based on this business concept cannot support other personalization processes.

In the case of EMV, it means that the cards supported by the business-modelling tool are banking cards based on the EMV standard. Thus, the tool cannot be reused for other card or business types.

Another drawback is related to the internal file architecture of the card. Indeed, let us assume that the customer needs another specific element on the card which is not defined by EMV. In this case, the creation of this specific element must be hard-coded, or another modelling concept (in this case an architectural one) has to be associated with the business model.

Thus, the business modelling cannot be applied to all card operating systems (especially for native cards) since the requirements are, in most cases, also OS-oriented.


2.4.3 Entities Modelling

The entities modelling is a design approach used in order to describe the file system of a card according to the ISO 7816-4 norm.

The model in this case, could be seen as an architectural view describing the chip content [AxPerso].

[Figure: an architectural view of the entities, made by the personalizer, is passed to an interpreter that automatically generates the script with low-level commands.]

Figure 5: Modelling with an entities approach

The model is then interpreted according to the operating system of the card in order to generate the correct low-level commands.

Advantages of the approach

The entities modelling is a complementary method to the business approach. Indeed, the model in this case is not restricted to a special business, since any entity described in the ISO 7816-4 norm can be created and modified.

Moreover, this offers a flexible way of describing the chip content without caring about the operating system commands. This aspect makes the approach easy to use for a non OS-specialist.

Drawbacks of the approach

The drawback of the entities model concerns the extensibility of the concept. Indeed, since the approach is based on the card file system, it cannot be applied to cards that hide this file system behind higher-level entities, as open platform cards do.


2.4.4 Global platform

In the global platform approach, an OP entity (application, card, key…) is described using an XML file (also called profile) that contains all the parameters of the entity [GPP].

This profile contains also a script fragment (in JavaScript) used in order to perform the needed actions on the card [GPS].

In other words, the only personalization step involved is the interpretation of the script, which has already been written and customized by the entity provider [GPC].

This concept has been introduced in order to standardize and increase the interoperability between the different entity providers. That is why it cannot be seen as a modelling solution, but as a personalization approach that could have a serious impact on modelling solutions in the future. The special aspects that have to be taken into consideration are the format of the profiles (XML) and the high-level scripting language (JavaScript).
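The idea of a profile that carries both the entity parameters and an embedded script fragment can be sketched as follows. This is a deliberately simplified illustration of ours, not the actual Global Platform profile schema; the element names, attribute names and the JavaScript fragment are invented.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical profile: parameters plus an embedded script fragment.
profile = ET.fromstring("""
<Profile name="DemoApplet">
  <Parameter key="AID" value="A000000003000000"/>
  <Script language="JavaScript"><![CDATA[
card.select(aid);
card.install();
]]></Script>
</Profile>
""")

aid = profile.find("Parameter").get("value")
fragment = profile.find("Script").text.strip()
print(aid)       # A000000003000000
print(fragment)
```

During personalization, the interpreter would simply extract and execute the embedded fragment with the profile's parameters in scope.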


3 The Modelling Tool Project

3.1 Project objectives

The main objective of the modelling tool is to provide an automated process for the personalization of access and banking cards. This process will include all personalization steps, from the definition of the customer requirements to the generation of the scripts. The suite also has to provide further functionalities in the future, like sample card generation, and to be fully extensible [KRS].

Indeed, the goal is not to implement a card or platform specific tool, but a generic suite, which could be extended not only to support other cards or businesses, but also new functionalities in the system.

The first objective of the suite is to generate the scripts used in production to personalize the smart cards, in order to avoid the manual generation of these files.

The technical challenge, however, is to be able to support all kinds of cards and to avoid the current situation, which consists of developing new applications each time a new operating system becomes available, or each time the customer requirements are too “specific”.

In addition, even if the primary goal is to make the personalization developer’s work easier and more interesting, another dimension to the personalization process had to be taken into consideration. This aspect is “Marketing” oriented.

Indeed, other personalization services like the sample cards generation via the web should facilitate the interactions between the customer, the technical support and the personalization technologies available. These services should be easy to develop and to integrate in future versions of the tool.

Finally, this new tool should integrate new smart card standards (like Global Platform for instance) by supporting at least the XML profiles. Moreover, since the personalization methods are currently being improved (like the ongoing work on a new approach based on “JavaScript” to replace the assembler-like scripting languages), the implementation of the suite should offer the possibility of adding new features in order to be fully extensible.

The first version of the tool will, however, be a prototype. The aim is not to implement all the functionalities described in the previous paragraphs, but to identify the kernel of the tool in order to study the challenging components involved in the modelling and the automatic process.

3.2 System overview

The aim of the diagrams defined below is not to impose a specific system architecture, but to give a general overview of the different subprojects and to define the dependencies between these modules.

This definition would help in the strategic management of the project.

Indeed, the detailed study of all the requirements confirmed the fact that the suite would be a very complex system to implement and that a progressive approach had to be defined [DSA]. The aim of the following diagram is to define this approach by splitting the system according to the different needs identified in the use cases document [UC].

• Business view, Personalization view, application view:

These views represent the graphical user interfaces of the suite. These modules are of course important for the presentation of a prototype, but do not really affect the internal functionalities of the suite. In other words, the views are responsible for representing information and do not contribute to the intelligence of the tool.

• “General Needs”-Module

This module contains all the functionalities related to a powerful and professional system. This includes file management (for instance transport, if the web aspect is integrated, and storage…), a reports manager (generation of business reports, customer approval forms…) and a security manager (access conditions, login process…). This module deals with important aspects of the project, which could however be treated in future versions of the suite.

• Sample Card generation

This module depends on the components generating the script that would be used to make the sample cards.

• Online services

This module is only used for the sample cards generation and can be treated with a low priority.

• Kernel (Card Manager, Macros Manager, Script Generator…)

This component is the most important part of the suite as far as the internal functionalities are concerned. Indeed, it contains all the intelligent and processing modules dealing with OS and business restrictions, macros logic and script generation.

Thus, the kernel has to be studied in detail before any other component of the suite [DSA].

[Diagram: module dependencies. The Application, Business and Personalization views, the Sample Card Generation and Online Services modules, and the General Needs module (Profiles Manager, Security Manager, Report Manager) are all connected to the kernel (Card Manager / Macros Manager / Script Generator). Three arrow types indicate that the destination module is compulsory for, important to complete, or an optional functionality of the starting module.]


The previous diagram gives an idea of the dependencies between the different components inside the suite. This approach, however, does not give a real system overview explaining the high-level architecture of the suite.

The aim of the following diagram is to define this high-level architectural view. The main objective of course is the definition of a fully extensible architecture, which would really facilitate the integration of new operating system and business components in the suite, or even new functionalities.

[Diagram: high-level architecture. A Visualization Framework (perso developer view, technical support view, business view) sits on top of a Personalization Framework containing the Macros Manager, OS Manager, Script Generator, Data Model and Business Manager. Model plug-ins (new card, new application, new business…) and new functional plug-ins (e.g. data processing) extend the kernel, alongside the Online Services, Profiles Manager and Security Manager modules.]


First of all, it is important to explain why the term framework was chosen in the system overview.

The kernel component is perhaps the most important one, but it has to be extensible for any new operating system or business. In other words, recompiling the kernel of the suite each time there is a new requirement is not acceptable.

That is why the main challenge in the design part is the definition of a framework-based architecture with an easy plug-in system:

o The visual framework:

This framework defines the common set of functionalities for the different views of the suite (personalization developer, technical support, customer…). This set includes the communication process with the personalization framework (or kernel).

o The personalization framework (Kernel of the suite)

The kernel should also provide a common set of functionalities available for the internal process mechanism inside the tool.

There are, for instance, at least three different mechanisms inside the kernel that must be studied and for which technological solutions have to be found: the management of the XML profiles (transformation, modification, update…), the generation of the scripts, and the validation of the profile according to the specific OS and business rules.

That means that each component is specialized by business and OS but that a common design concept has to be implemented.

o Plug in system

The suite has to support at least three different plug-in mechanisms:

o Support new operating systems and new businesses (add new specific wizards in the visual part, add the new set of rules in the profile validation or the script generation)

o Easily add new functionalities inside the suite (web service for the online sample card generation, support other security mechanisms…)

o Add new concepts inside the suite (like the data processing view and the related mechanism inside the kernel)
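The plug-in idea can be sketched as a simple registry in which OS and business modules register themselves instead of being compiled into the kernel. Names and interfaces below are illustrative only, not the suite's actual design.

```python
# Hypothetical plug-in registry: modules register themselves under a
# (kind, name) key; the kernel looks them up instead of hard-coding them.
REGISTRY = {}

def register(kind, name):
    def deco(cls):
        REGISTRY[(kind, name)] = cls
        return cls
    return deco

@register("os", "NativeOS-X")
class NativeGenerator:
    def generate(self, profile):
        return ["; native script for " + profile]

@register("os", "OpenPlatform")
class OPGenerator:
    def generate(self, profile):
        return ["// OP script for " + profile]

gen = REGISTRY[("os", "OpenPlatform")]()
print(gen.generate("demo")[0])  # // OP script for demo
```

Supporting a new operating system then means shipping one more registered class, without touching the kernel.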


3.3 Study objectives

The main objective of the study is to concentrate the work on the kernel component of the modelling tool, by focusing on three aspects:

o The model that describes the personalization data i.e. the design approach that should be used and that could support the different personalization requirements (operating systems, businesses…).

Moreover, the model should be as generic as possible in order to fulfil the requirements of any type of smart card (especially PKI and banking), and to be adaptable to any personalization environment.

Finally, since this model is considered as the starting point for the automatic script generation, it has also to include all the information needed to correctly generate the different commands involved in the personalization of the card.

o The automatic script generation i.e. the interpretation of the defined model in order to automatically generate the personalization script. The method should also be generic and adaptable to any scripting language. The conception and implementation, however, are to be done for a high-level scripting language like JavaScript. Indeed, one of the project requirements is to test the new scripting languages in terms of modelling and performance during the execution of the script.

o The implementation of an example based on the open platform concept i.e. the implementation of an OP operating system specific component that should follow the defined model and the automatic process. In addition, conception and implementation strategies have to be found in order to make the component reusable by other OP OS modules. This part is


4 Model Proposed

The new proposed model can be seen as a “hybrid” approach, combining the advantages of the different existing models. Indeed, as shown in the figure below, the card entities approach is kept but extended in order to support all types of cards, and the macros strategy is, in this case, an intermediary stage between the model and the low-level commands sent to the card during the personalization.

All the modules involved in this design strategy are explained in this section. The aim is to present their characteristics and the advantages of the new approach compared to the old models. Indeed, even if the new approach seems to be a mixture between the existing procedures, many new features are added in order to make it more powerful.

[Figure: hybrid approach. The entities model, made by the personalizer, is interpreted into a list of high-level commands, which a second interpreter automatically translates into the script with low-level commands.]


4.1 Entities Modelling

This part is considered as the starting point of the whole personalization process. Moreover, it is the only step in which the user is one hundred per cent involved. Indeed, it includes the choice of the card and the business, and the definition of the different parameters according to the requirements of the customer.

The result of this modelling part is a document describing the different card entities and containing the parameters used in order to generate the personalization script.

In the following parts, this document will be called the “Application Profile”.

4.1.1 Description of the Application Profile

The application profile grammar should take into consideration the characteristics of any type of cards. Indeed, the native cards are “architecture” oriented and the open platform ones mask this feature. Moreover, even if the two types have different authentication and security concepts, the design should include a high-level security approach that would be inherited and extended by native and OP cards. In addition, each entity described in the profile should contain all the information necessary to the personalization. That means that the operating system information, the business parameters and the way the entity is computed or retrieved from the secure network should be available within the profile.

Thus, the application profile contains three main parts: The “Data-Structure” describing the file system or the entity hierarchy within the card, the “Security-Domains” containing the information necessary to the authentication (cryptographic keys) and the “Data-Source“ describing all the parameters of the entities.

Another important point is the format of the application profile. Indeed, if the defined grammar is used to generate text files representing the profile, a compile mechanism should be provided in order to parse the document, check its validity (in terms of syntax) and extract the needed information. This would add some complexity to the system and limit the extensibility of the approach.


That is why an XML approach is appropriate for the format of the profile. In this case the information retrieval mechanism is hidden by the XML parser, which checks the syntax of the document and offers an interface to navigate within the profile and extract all the information needed for the personalization.

In addition, the XML concept [Xw3c] already offers an architectural approach that is useful both for splitting the profile into the three data parts and within the “Data-Structure” itself.

Finally, the XML approach would ensure the exchange and storage of the application profiles on different platforms. Indeed, the XML format is not related to a specific environment, and many open source projects are providing new tools and libraries improving the use of XML [AXP].
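A minimal sketch of such a three-part profile, parsed with a standard XML library, might look as follows. The element names follow the text above (Data-Structure, Security-Domains, Data-Source), but the exact grammar and attributes are assumptions of ours.

```python
import xml.etree.ElementTree as ET

# Hypothetical three-part application profile.
doc = ET.fromstring("""
<ApplicationProfile>
  <Data-Structure>
    <Entity type="Root"><Entity type="DF" ID="101"/></Entity>
  </Data-Structure>
  <Security-Domains><Key ID="1" usage="authentication"/></Security-Domains>
  <Data-Source><Item ID="101" source="computed"/></Data-Source>
</ApplicationProfile>
""")

# The parser hides the retrieval mechanism: the three parts are simply
# the children of the root element.
for part in doc:
    print(part.tag)
# Data-Structure
# Security-Domains
# Data-Source
```

The parser takes over syntax checking, so no dedicated compile mechanism is needed for the profile format.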

4.1.1.1 The “Data-Structure” part

4.1.1.1.1 Architectural concept

This part of the application profile is first of all very important for native cards. Indeed, the file system of the card can be represented within this section using the properties of XML:

Example:

<Entity type="Root">
  <Entity type="DF" ID="101">
    <Entity type="EF" ID="301"/>
    <Entity type="EF" ID="302"/>
  </Entity>
</Entity>

This sample profile extract can be seen as the following file system within the card:


Root
└─ DF (Directory, Identifier = 101)
   ├─ EF (Elementary File, Identifier = 301)
   └─ EF (Elementary File, Identifier = 302)

Figure 9: Sample file system

Thus, the XML concept appears to be the most natural way of describing such structures. Even if this aspect is not a particular requirement for open platform cards (in their case the file system is hidden by the card and the card entities are on the same hierarchy level), it can be considered as a high-level concept usable for any type of card.

Indeed, since the objective is to find a generic and extensible model for the card description, it is better to have as much information as needed, even if it is not one hundred percent used by all types of cards.

Moreover, this modelling part is not only useful for structure description but is also used during the script generation to easily retrieve the commands logic.

Indeed, one of the commands dependencies is the selection mechanism that must be hidden from the user: In order to communicate with a specific on-card entity, it must first be selected [OPCS]. For native cards, this selection might include the selection of the parent entity.

Thus, using an XML interface to retrieve each card entity described in the profile would provide an easy access to the parent, which allows the implementation of a generic algorithm for entities selection.

Example: Let’s assume that one of the requirements of the card operating system is the selection of each parent of the entity starting from the root. The data-structure facilitates the definition of a recursive algorithm for the selection mechanism:


/* High-level function describing the selection mechanism */
Function SelectAll (Entity toSelect)
    If toSelect ≠ Root
        Then SelectAll (toSelect.getParent())
    Select (toSelect)

This descriptive example shows that the definition of a data structure within the profile, allowing navigation through the entities hierarchy, is a very important aspect of the modelling.

Indeed, it would be used to access the architectural information of the card model and improve the dynamic generation of the commands involved in the personalization.
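For illustration, the recursive selection sketched above can be made runnable on the XML data structure itself. Since the standard parser keeps no parent pointers, a parent map is built first; this is a sketch of ours, not the tool's actual implementation.

```python
import xml.etree.ElementTree as ET

# The sample file system from the profile extract above.
root = ET.fromstring(
    '<Entity type="Root">'
    '  <Entity type="DF" ID="101">'
    '    <Entity type="EF" ID="301"/>'
    '  </Entity>'
    '</Entity>')

# ElementTree has no getParent(), so build a child -> parent map once.
parent = {child: node for node in root.iter() for child in node}

def select_all(entity, emit):
    """Select every ancestor starting from the root, then the entity itself."""
    if entity in parent:                 # entity is not the root
        select_all(parent[entity], emit)
    emit(entity.get("ID", "Root"))

selected = []
select_all(root.find(".//Entity[@ID='301']"), selected.append)
print(selected)  # ['Root', '101', '301']
```

The emitted identifiers correspond to the SELECT commands that the script generator would produce, in the required root-first order.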

4.1.1.1.2 State Modelling

This design strategy remains, however, very static. The architecture of the model required for the card is well defined, but there is no information describing the state of the smart card and linking this model to the actual card state.

In order to understand the importance of this aspect, let’s consider the following example:

The configuration is very simple because the aim is to understand the “state modelling” concept: We assume that the card used is open platform and that it was already pre-personalized, with an applet inside i.e. that a specific applet has already been loaded on the card.

The diagram on the left describes the real state of the card (before or after the personalization process i.e. after script execution), and the one on the right represents the application profile with the modelling made by the user.

This approach is important in order to understand the differences between the modifications made on the model and their impact on the real state of the smart card.
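In its simplest form, the comparison between the modelled content and the actual card state can be reduced to a set difference, as in this deliberately minimal sketch (the entity names are invented):

```python
# Entities already on the pre-personalized card vs. entities required
# by the application profile; only the missing ones need to be created.
card_state = {"AppletA"}                      # pre-personalized content
model      = {"AppletA", "AppletB", "Keys"}   # content required by the profile

to_create = sorted(model - card_state)
print(to_create)  # ['AppletB', 'Keys']
```

A real implementation would of course also have to handle entities to update or delete, but the principle is the same: the script is generated from the difference between model and state, not from the model alone.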
