Data-Driven Iconification

Expressive 2016

Teaser figure

Four icon customization workflows supported by our algorithms. (a) Sketch-based pictogram modeling: given an input photo (top left) of a motorbike, the user sketches a polygon (top right) over the photo; this sketch and the keyword "motorbike" are the only user inputs (the photo itself is not used). Our method remixes partially similar pictograms (bottom left) to create a pictogram (bottom right) that matches the user's sketch. (b) Sketch-based pictogram editing: starting with an existing pictogram of a camera (top left), the user sketches a flash (top right, green blob) on top of the camera, and our method remixes the partially similar pictogram (bottom left) to create a pictogram of a camera equipped with a flash (bottom right). (c) Pictogram hybrids: starting from several stock pictograms of boy faces (Fig. 13a in our paper), our method creates random yet visually appealing hybrids. (d) Pictogram montage: guided by the user's scribbles (top, green), our method helps the user merge two pictograms while retaining the user-selected parts of each (bottom).

Abstract

Pictograms (icons) are ubiquitous in visual communication, but creating the best icon is not easy: users may wish to see a variety of possibilities before settling on a final form, and they might lack the ability to draw attractive and effective pictograms by themselves. We describe a system that synthesizes novel pictograms by remixing portions of icons retrieved from a large online repository. Depending on the user's needs, the synthesis can be controlled by a number of interfaces ranging from sketch-based modeling and editing to fully automatic hybrid generation and scribble-guided montage. Our system combines icon-specific algorithms for salient-region detection, shape matching, and multi-label graph-cut stitching to produce results in styles ranging from line drawings to solid shapes with interior structure.
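As a rough illustration of the shape-matching retrieval step mentioned above, the sketch below ranks repository pictograms by how well their silhouettes match a user's sketch, using OpenCV's Hu-moment contour comparison. This is not the paper's actual matching algorithm; the file paths, repository layout, and the choice of OpenCV (version 4) are assumptions made purely for illustration.

```python
# Illustrative sketch only: retrieve pictograms whose outlines resemble a user
# sketch via Hu-moment shape matching (OpenCV 4). Paths are hypothetical.
import glob
import cv2

def largest_contour(path):
    """Load a pictogram as a binary mask and return its largest outer contour."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # Assume dark shapes on a white background; invert so the shape is foreground.
    _, mask = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)

def rank_by_shape(sketch_path, repo_glob="icons/motorbike/*.png", k=5):
    """Return the k repository pictograms whose outlines best match the sketch."""
    query = largest_contour(sketch_path)
    scored = []
    for path in glob.glob(repo_glob):
        candidate = largest_contour(path)
        # Lower score means more similar; Hu moments are scale/rotation invariant.
        score = cv2.matchShapes(query, candidate, cv2.CONTOURS_MATCH_I1, 0.0)
        scored.append((score, path))
    return [p for _, p in sorted(scored)[:k]]

# Example usage (hypothetical files):
# candidates = rank_by_shape("user_sketch.png")
```

In a full pipeline such as the one described in the abstract, retrieved candidates like these would then be cut apart and recombined, for example with a multi-label graph-cut stitch, rather than used whole.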

Paper

PDF (9MB)

Supplementary Materials

Awards

Best Paper Award at Expressive 2016.