
Introduction to Handwritten Signature Verification
Dave Fenton
University of Ottawa
SPOT presentation, University of Ottawa, 29 Oct 2004


Handwritten signature verification
Presentation overview (altered to remove all signatures):
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research


Goal of HSV
• To verify a person's identity based on the way in which he/she signs his/her name
• Two types of system:
  • Offline systems use static features (the signature image)
  • Online systems use dynamic features (the time series)
• Written passwords are also under consideration


Applications
Principal application: reduce fraud in financial transactions
• Cannot rely on sales staff to visually verify signatures on credit card receipts
• Occasional acceptances of forgeries are allowable
• Rejections of valid signatures may irritate valuable customers
• To date, used mostly for electronic signature of business documents (hash function protects document against alteration)


Document verification
• Apply hash function to document to generate hash code
• If signature is valid, encrypt hash with signer's private key
• Recipient decodes received hash using public key
• If document has been altered, hashes don't match
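The tamper-detection half of this scheme can be sketched as follows. This is a minimal illustration using SHA-256; the public-key encryption of the hash (the actual "signature") is omitted, and the document contents are made up:

```python
import hashlib

def document_hash(document: bytes) -> str:
    # Generate a fixed-length hash code for the document.
    return hashlib.sha256(document).hexdigest()

# The sender hashes the document; in a real system this hash would then be
# encrypted with the signer's private key before transmission.
original = b"Pay to the order of Alice: $100"
sent_hash = document_hash(original)

# The recipient recomputes the hash over whatever document arrived.
tampered = b"Pay to the order of Alice: $900"
assert document_hash(original) == sent_hash   # unaltered: hashes match
assert document_hash(tampered) != sent_hash   # altered: hashes don't match
```

Any change to the document, however small, changes the recomputed hash, which is why a mismatch reveals alteration.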


Applications
Secondary application: access security for buildings or mobile computing devices
• For building security, it would not be tolerable to accept forgeries
• HSV would have to be combined with on-site security staff or other biometric/password/PIN systems
• Already used on some laptops and PDAs


Naïve assumptions
• A person signs his or her name consistently each time
• All signatures contain enough steady features to be reliably verified
• A forger cannot perfectly imitate the dynamic features of a signature
• All a user's passwords can be replaced by his/her signature


Example of consistency – static
Password: "Prejunife"
[Signature images: 21 July 2003, 2 Sept 2003, 24 Sep 2003]


Example of consistency – dynamic
[Figure: Y-velocity (cm/s) traces of repeated genuine signatures plotted against normalized time, with closely matching profiles]


Example of inconsistency – static
Password: "Ingusions"
[Signature images: 12 Jan 2004, 18 Mar 2004, 27 Sep 2004]


Example of inconsistency – dynamic
[Figure: Y-velocity (cm/s) traces of repeated signatures plotted against normalized time, with profiles that differ from sample to sample]


Example of forger ability – static
Password: "Prejunife"


Example of forger ability – dynamic
[Figure: Y-velocity (cm/s) traces of forgery attempts plotted against normalized time]


More realistic assumptions
• Most signers sign their names consistently
• Most signatures contain enough steady features to be reliably verified
• Most forgers cannot reproduce a signature well enough to defeat a good verifier
• It is more difficult to forge both the static and dynamic features of a signature than just the static features


Fallout from broken assumptions
• It may not be possible to verify all signatures reliably
• For any signature, there will probably exist a skilled forger who can forge it competently
• Serious consideration must be given to passwords:
  • confidential
  • easily replaced if template compromised
  • can exert some control over the length (quality of features)
  • can request the signer to write legibly


Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research


Basics of biometrics
Physical v. behavioural biometrics:
• A physical biometric makes use of a fixed characteristic of the body (e.g. fingerprints, iris patterns, retina patterns, hand geometry, facial features)
  • The most accurate methods are usually perceived as too intrusive
• A behavioural biometric makes use of personal behaviours which are assumed to be almost invariant (e.g. voice, handwriting, typing, gait)
  • Perceived as less intrusive, but less accurate than physical biometrics


Two stages
1. Enrolment:
  • A user's signature characteristics are learned from a small number of input samples. The resulting information is called the template.
  • Typically, 3 – 5 signatures are used
2. Verification or recognition:
  • For verification, a candidate signature is compared to the template of a single signer.
  • For recognition, the candidate signature must be compared against many templates.


FRR and FAR
• Two error rates are specified:
  • The False Rejection Rate (FRR) is the rate at which valid signatures are rejected.
  • The False Acceptance Rate (FAR) is the rate at which forged signatures are accepted as valid.
• In many cases, low FRR implies high FAR, and vice-versa
• Current state of the art: FRR and FAR sum to 2 – 5%. Actual numbers may be even worse!
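The two error rates can be computed as follows. This is an illustrative sketch with made-up verifier scores, assuming a verifier where a higher score means "more likely genuine":

```python
def frr_far(genuine_scores, forgery_scores, threshold):
    """Compute FRR and FAR at a given decision threshold.

    FRR: fraction of genuine signatures rejected (score below threshold).
    FAR: fraction of forgeries accepted (score at or above threshold).
    """
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in forgery_scores) / len(forgery_scores)
    return frr, far

# Hypothetical verifier outputs for five genuine signatures and five forgeries:
genuine = [0.9, 0.8, 0.85, 0.4, 0.95]
forged = [0.3, 0.6, 0.2, 0.75, 0.1]

frr, far = frr_far(genuine, forged, threshold=0.7)
# Raising the threshold drives FAR down but FRR up, and vice-versa,
# which is the trade-off described above.
```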


ROC curves
• Most verifiers have a single numerical output. If the output level is above a decision threshold, the signature is accepted as valid; otherwise it is rejected.
• In this case, the FRR and FAR can both be plotted against the decision threshold in a receiver operating characteristic (ROC) curve.
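Sweeping the decision threshold traces out both curves, and the point where they cross is the equal error rate. A minimal sketch, again with hypothetical scores where higher means "more genuine":

```python
def equal_error_rate(genuine_scores, forgery_scores):
    """Sweep candidate thresholds and return the one where FRR and FAR
    are closest to equal, along with both rates at that threshold."""
    best = None
    for t in sorted(set(genuine_scores) | set(forgery_scores)):
        frr = sum(s < t for s in genuine_scores) / len(genuine_scores)
        far = sum(s >= t for s in forgery_scores) / len(forgery_scores)
        if best is None or abs(frr - far) < abs(best[1] - best[2]):
            best = (t, frr, far)
    return best  # (threshold, FRR, FAR) nearest the crossing point

threshold, frr, far = equal_error_rate([0.9, 0.8, 0.85, 0.4, 0.95],
                                       [0.3, 0.6, 0.2, 0.75, 0.1])
```

With only ten samples the crossing is coarse; in practice the curves are estimated from many signatures, and the operating threshold need not be the equal-error point.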


Example ROC curves
[Figure: False Acceptance Rate (FAR) and False Rejection Rate (FRR) plotted against the decision threshold]


Example ROC curves
[Figure: the same FAR and FRR curves, with their crossing point marked as the Equal Error Rate]


Types of forgery
• Random. A random forgery is simply another person's valid signature.
• Simple. The forger spells the name correctly, but writes in his own style.
• Skilled, or knowledgeable. The forger tries to fully reproduce all the shapes and dynamics of the original signature. In this study, forgers are shown MPEG movies of the original signature.
• The training set consists of a few valid signatures and many random forgeries.
• After training, the verifier is tested against all 3 types of forgery.


Genuine samples
Password: "Taximotels"
[Signature images: 19 Sep 2003, 16 Oct 2003, 6 Nov 2003]


Forgeries
Password: "Taximotels"
[Signature images: random, simple and knowledgeable forgeries]


Motivations for research
• Despite company claims, error rates are high and need improvement
• For computing devices with pen inputs (PDAs, tablet PCs), automatic signature verification is a sensible technology
• Signatures are already a widely accepted means of identification


Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research


Data collection
• Signatures collected using an Interlink Electronics ePad-ink (100 Hz sampling frequency)
• Captures X & Y position, pressure, time stamp
• Data collection program written in C++
• Data protected by PGPdisk


Data collection
• Two levels of volunteer:
  • Level 1: one signing session
  • Level 2: three signing sessions
• Each volunteer contributes:
  • 10 samples of genuine signature
  • 10 samples of genuine password
  • Simple forgeries of 2 signatures and 2 passwords
  • Knowledgeable forgeries of a signature and a password


Enrolment
Acquire valid signatures:
• Operational systems typically collect 3 – 5 genuine signatures; academic systems up to 20
• Some use warping and interpolation schemes to "create" extra valid signatures


Enrolment
Preprocess:
• Concatenate strokes into single time sequence
• Render invariant to:
  • translation: subtract X & Y means
  • rotation: force linear regression line to be horizontal
  • scale: may normalize based on box size or signal power
• Normalization of duration is not carried out at this stage
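The three invariance steps can be sketched in a few lines. This is an illustrative pure-Python version; it uses the bounding-box option for scale normalization, one of the two options mentioned above:

```python
import math

def preprocess(xs, ys):
    """Normalize a pen trajectory for translation, rotation and scale.
    Duration is deliberately left untouched, as on the slide."""
    n = len(xs)
    # Translation: subtract the X and Y means.
    mx, my = sum(xs) / n, sum(ys) / n
    xs = [x - mx for x in xs]
    ys = [y - my for y in ys]
    # Rotation: rotate so the least-squares regression line is horizontal.
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    theta = math.atan2(sxy, sxx)  # angle of the regression line
    c, s = math.cos(theta), math.sin(theta)
    xs, ys = ([x * c + y * s for x, y in zip(xs, ys)],
              [y * c - x * s for x, y in zip(xs, ys)])
    # Scale: normalize by the bounding-box extent (avoid dividing by zero).
    span = max(max(map(abs, xs)), max(map(abs, ys))) or 1.0
    return [x / span for x in xs], [y / span for y in ys]
```

After this step two signatures written at different positions, angles and sizes on the pad become directly comparable.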




Enrolment
Select features:
• With function features, typically use the same features for each signer
• Not all discrete features are equally informative
• Cost used for feature selection is usually error rate; classifier dependent
• Sequential forward/backward search
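Sequential forward search greedily adds the feature that most reduces the cost at each step. A sketch, where `error_rate` stands in for "train the chosen classifier on this feature subset and measure its error rate" (which is why the cost is classifier dependent), and the feature names and gains are made up:

```python
def sequential_forward_search(features, error_rate, k):
    """Greedily grow a feature subset of size k, at each step adding the
    feature whose inclusion gives the lowest error rate."""
    selected = []
    while len(selected) < k:
        best = min((f for f in features if f not in selected),
                   key=lambda f: error_rate(selected + [f]))
        selected.append(best)
    return selected

# Toy cost: pretend each feature independently removes some error.
informative_gain = {"duration": 0.5, "mean_pressure": 0.3, "pen_lifts": 0.1}

def toy_error(subset):
    # Stand-in for training and evaluating the classifier on `subset`.
    return max(0.0, 1.0 - sum(informative_gain[f] for f in subset))

chosen = sequential_forward_search(list(informative_gain), toy_error, k=2)
```

Sequential backward search works the same way in reverse, starting from the full set and discarding the least useful feature at each step.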


Enrolment
Create template:
• Best performance so far: keep raw data of multiple signatures
  • Bad practice, from a security perspective
• Template may also include list of features to keep, best classifier to use, decision thresholds


Verification
• Initial steps of verification are same as enrolment
• Only selected features need to be extracted


Verification
• Template ID is acquired at same time as candidate signature
• Designated by swipe card, PIN number, etc.
• Template is usually stored in central database, but may also be held on swipe card


Verification
• Many different classifiers have been tried
• May have to combine results from multiple classifiers
• Will be covered in more detail later


Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research


Technical difficulties
• Physiology of handwriting is not well understood
• Signers not motivated to sign in a careful, invariant manner
• Training sets are sparse and badly imbalanced (few valid signatures)
• No knowledgeable forgeries available for training
• Short, variable signatures often easily forged


Technical difficulties
• No standard database of signatures and forgeries:
  • every researcher uses a different set of amateur forgeries
  • some researchers test only against random forgeries
  • FAR is ill-defined; a low error rate may reflect the forgers' lack of skill rather than the verifier's ability


Technical work-arounds
• Disqualify certain signers during enrolment
• Allow multiple signing attempts
• Allow probationary period with relaxed acceptance criteria (collect more training signatures)
• Use passwords with a certain minimum length


Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research


Past research
• Research has been underway for several decades. Peak activity in mid-1980s to mid-1990s.
• Early ideas are still good performers because they have fewer control parameters
• Between 30 – 60% of forgeries can be detected by a basic time verifier


Time verifier
[Figure: histograms of total signing time in seconds, marking samples accepted and rejected by the time verifier]
• Genuine signatures: sample mean 3.58, sample deviation 1.53
• Simple forgeries: sample mean 4.46, sample deviation 1.60
• Knowledgeable forgeries: sample mean 6.11, sample deviation 2.74
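A basic time verifier of the kind behind these histograms can be sketched in a few lines: accept a candidate if its total signing time falls within some band around the enrolment mean. The enrolment times and the width parameter `k` below are assumptions for illustration:

```python
def make_time_verifier(training_times, k=2.0):
    """Build a verifier that accepts a signature whose total signing time
    lies within k sample deviations of the enrolment mean."""
    n = len(training_times)
    mean = sum(training_times) / n
    std = (sum((t - mean) ** 2 for t in training_times) / (n - 1)) ** 0.5
    def verify(total_time):
        return abs(total_time - mean) <= k * std
    return verify

# Five hypothetical enrolment times (seconds) for one signer:
verify = make_time_verifier([3.2, 3.9, 3.5, 3.8, 3.6])
# A candidate near the enrolment mean is accepted; one near the
# knowledgeable-forger mean of ~6 s falls outside the band and is rejected.
```

Its appeal matches the slide's point about early ideas: a single control parameter (`k`), so there is little to mistune.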


Classifiers
• Euclidean distance
• Weighted linear metrics
• Regional correlation
• Dynamic time warping (DTW)
• Neural networks
• Hidden Markov models (HMMs)
• Bayesian belief nets


Features used
• Function features (usually position, velocity and pressure)
• Vectors of discrete features
• Features calculated within sliding window (e.g. centre of mass, torque)
• Wavelet coefficients
• LPC coefficients
• Walsh transform of pen-up/pen-down signal
• Pre-defined strokes (HMMs)


State of the art
• In academic studies, more complicated verifiers often achieve better results than simple verifiers
• However, in field use, simple verifiers like DTW often outperform everything else:
  • few adjustable parameters
  • with normalization, can set a single decision threshold for all signers
• Best verifier in public contest: DTW with 5-signature template
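Part of DTW's appeal is that it aligns two signatures of different durations before comparing them, rather than requiring resampling to a common length. A minimal sketch of the classic recurrence over 1-D feature sequences (e.g. velocity profiles); the example sequences are made up:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences,
    via the standard O(len(a) * len(b)) dynamic-programming table."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Each cell extends the cheapest of the three allowed moves.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# The same stroke shape signed at two tempos warps onto itself cheaply:
slow = [0, 1, 1, 2, 2, 1, 0]
fast = [0, 1, 2, 1, 0]
```

A rigid point-by-point Euclidean comparison would first have to truncate or resample these sequences; DTW absorbs the tempo difference in the alignment itself.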


State of the art
• Most sophisticated verifier: Plamondon's Sign@metric solution
  • discrete parametric verifier
  • physiological delta-lognormal verifier
  • static feature verifier
  • claimed performance: error rate of 0.0003% among 86,500 people!!
• Other companies that did not take part in public contest: CIC, Cyber-SIGN, SoftPro, Wondernet


Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research


My research
• Classifier comparison (DTW, NN, SVM, weighted distance metric)
• Techniques to mitigate imbalance of training data
• Re-open debate on the use of passwords
• Data analysis across signing sessions
• Feature selection algorithm that gives preferential treatment to features that are most likely to be stable
• Use of support vector machines


Questions?
To volunteer: please e-mail d.fenton@ieee.org
