Day 89 - 100 Days of Swift
Project 27 (part 2)
Day 89 is the second part of the twenty-seventh project. You review the Core Graphics code from yesterday and then Paul Hudson gives you a few challenges: draw a replica of an emoji using a CGContext, draw the word “TWIN” in a CGContext, and go back to project 3 to add an overlay that says “From Storm Viewer” when the user shares an image.
For the first challenge, it was a lot of trial and error for me. What I ended up with was one big function:
func drawEmoji() {
    let size = CGSize(width: 512, height: 512)
    let renderer = UIGraphicsImageRenderer(size: size)
    let image = renderer.image { context in
        let rectangle = CGRect(origin: .zero, size: size)
            .insetBy(dx: 10, dy: 10)
        let cgContext = context.cgContext

        // Draw base yellow circle
        let path = UIBezierPath(ovalIn: rectangle)
        path.addClip()

        let colorspace = CGColorSpace(name: CGColorSpace.sRGB)
        let colors = [UIColor.yellow.cgColor, UIColor.orange.cgColor]
        let locations: [CGFloat] = [0.5, 1.0]
        guard let gradient = CGGradient(colorsSpace: colorspace,
                                        colors: colors as CFArray,
                                        locations: locations) else { return }

        cgContext.clip()
        let center = CGPoint(x: size.width / 2, y: size.height / 2)
        cgContext.drawRadialGradient(gradient,
                                     startCenter: center,
                                     startRadius: 0,
                                     endCenter: center,
                                     endRadius: size.width * 0.65,
                                     options: CGGradientDrawingOptions())

        // Add eyes
        let eyeOneRect = CGRect(x: 160, y: 155, width: 50, height: 70)
        cgContext.addEllipse(in: eyeOneRect)
        let eyeTwoRect = CGRect(x: size.width - 210, y: 155, width: 50, height: 70)
        cgContext.addEllipse(in: eyeTwoRect)

        // Add mouth
        let mouthStart = CGPoint(x: 80, y: 290)
        let mouthEnd = CGPoint(x: size.width - 80, y: 290)
        cgContext.move(to: mouthStart)
        cgContext.addCurve(to: mouthEnd,
                           control1: CGPoint(x: 100, y: 320),
                           control2: CGPoint(x: size.width - 100, y: 320))
        cgContext.addCurve(to: mouthStart,
                           control1: CGPoint(x: size.width - 140, y: 480),
                           control2: CGPoint(x: 140, y: 480))
        cgContext.setFillColor(UIColor.brown.cgColor)
        cgContext.drawPath(using: .fill)
    }
    imageView.image = image
}
If you look closely at any of Apple’s actual emoji, you’ll find that basically everything is a gradient. I spent a lot of time figuring out how to get the first gradient set up, and I still feel like I have a pretty tenuous grasp on what exactly is happening, so I didn’t spend too much time making gradients for the eyes or mouth. Once I got over that hurdle, though, it was pretty much just a matter of drawing a couple of ellipses for eyes and a path with a couple of curves for the mouth. When all is said and done, it looks like this:

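To make sense of what that gradient code is actually doing, it helps to strip it down to just the gradient. This is the same CGGradient and drawRadialGradient setup from above, isolated so the color stops and the two circles are easier to see (a stand-alone sketch, not code from the project):

import UIKit

// Minimal sketch: draw only a radial gradient, no clipping or facial features.
func radialGradientImage(size: CGSize = CGSize(width: 512, height: 512)) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { context in
        let cgContext = context.cgContext

        // Two color stops: solid yellow out to the halfway point,
        // then blending into orange at the edge.
        let colors = [UIColor.yellow.cgColor, UIColor.orange.cgColor] as CFArray
        let locations: [CGFloat] = [0.5, 1.0]
        guard let gradient = CGGradient(colorsSpace: CGColorSpace(name: CGColorSpace.sRGB),
                                        colors: colors,
                                        locations: locations) else { return }

        // The gradient is painted between two circles: a radius-0 circle at the
        // center (effectively a point) and one big enough to cover the canvas.
        let center = CGPoint(x: size.width / 2, y: size.height / 2)
        cgContext.drawRadialGradient(gradient,
                                     startCenter: center,
                                     startRadius: 0,
                                     endCenter: center,
                                     endRadius: size.width * 0.65,
                                     options: [])
    }
}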
For the second one, it was just a matter of calling move(to:) and addLine(to:) in the right order to get the letters we were looking for:
func drawTWIN() {
    let size = CGSize(width: 512, height: 512)
    let renderer = UIGraphicsImageRenderer(size: size)
    let image = renderer.image { context in
        let cgContext = context.cgContext
        let xOffset = 50

        // T
        cgContext.move(to: CGPoint(x: 50 + xOffset, y: 160))
        cgContext.addLine(to: CGPoint(x: 130 + xOffset, y: 160))
        cgContext.move(to: CGPoint(x: 90 + xOffset, y: 160))
        cgContext.addLine(to: CGPoint(x: 90 + xOffset, y: 300))

        // W
        cgContext.move(to: CGPoint(x: 136 + xOffset, y: 160))
        cgContext.addLine(to: CGPoint(x: 156 + xOffset, y: 290))
        cgContext.addLine(to: CGPoint(x: 176 + xOffset, y: 180))
        cgContext.addLine(to: CGPoint(x: 196 + xOffset, y: 290))
        cgContext.addLine(to: CGPoint(x: 216 + xOffset, y: 160))

        // I
        cgContext.move(to: CGPoint(x: 232 + xOffset, y: 160))
        cgContext.addLine(to: CGPoint(x: 232 + xOffset, y: 300))

        // N
        cgContext.move(to: CGPoint(x: 248 + xOffset, y: 300))
        cgContext.addLine(to: CGPoint(x: 248 + xOffset, y: 165))
        cgContext.addLine(to: CGPoint(x: 318 + xOffset, y: 295))
        cgContext.addLine(to: CGPoint(x: 318 + xOffset, y: 160))

        cgContext.setStrokeColor(UIColor.black.cgColor)
        cgContext.setLineWidth(4)
        cgContext.drawPath(using: .stroke)
    }
    imageView.image = image
}
When you render it, it looks like this:

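If the corners of the letters ever looked too sharp at a thicker line width, the same stroked path can also take line cap and join settings right before the final drawPath call. Just an illustration; I left the defaults alone in my version:

cgContext.setLineCap(.round)   // rounds the exposed ends of each stroke
cgContext.setLineJoin(.round)  // rounds the corners where segments meet
cgContext.setLineWidth(12)     // a thicker line makes the effect visible
cgContext.drawPath(using: .stroke)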
For the third challenge, I added a function that returns an optional image and dropped it in where I was previously grabbing the image from the imageView:
private func prepImage() -> UIImage? {
    guard let image = imageView.image else { return nil }
    let renderer = UIGraphicsImageRenderer(size: image.size)
    let newImage = renderer.image { context in
        image.draw(at: .zero)

        let paragraphStyle = NSMutableParagraphStyle()
        paragraphStyle.alignment = .center
        let attrs: [NSAttributedString.Key: Any] = [
            .font: UIFont.systemFont(ofSize: 64),
            .paragraphStyle: paragraphStyle
        ]

        let string = "From Storm Viewer"
        let attributedString = NSAttributedString(string: string, attributes: attrs)

        // Center the text rect in the image
        let size = image.size
        let height: CGFloat = 448
        let width = height * 2
        let stringRect = CGRect(x: (size.width - width) / 2,
                                y: (size.height - height) / 2,
                                width: width,
                                height: height)
        attributedString.draw(with: stringRect,
                              options: .usesLineFragmentOrigin,
                              context: nil)
    }
    return newImage
}
// In share image
guard let image = prepImage()?.jpegData...
// Used to be guard let image = imageView.image?.jpegData
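The share method itself isn’t worth reproducing in full, but assuming it still looks roughly like the stock Project 3 version, the whole thing ends up something like this (a sketch of the surrounding code, not copied verbatim from my project):

@objc func shareTapped() {
    // prepImage() renders the overlay onto a copy of the current image
    guard let image = prepImage()?.jpegData(compressionQuality: 0.8) else {
        print("No image found")
        return
    }

    let vc = UIActivityViewController(activityItems: [image], applicationActivities: [])
    vc.popoverPresentationController?.barButtonItem = navigationItem.rightBarButtonItem
    present(vc, animated: true)
}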
This leads to a nice and obnoxious “From Storm Viewer” stamped in the middle of the image. It wouldn’t work great on dark images, but it is good enough for me today.

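If I ever wanted it to survive dark photos, the fix would probably live entirely in the attributes dictionary: white text with a black stroke and a shadow instead of the default black fill. A rough sketch (the values are guesses, not something I actually shipped):

let paragraphStyle = NSMutableParagraphStyle()
paragraphStyle.alignment = .center

let shadow = NSShadow()
shadow.shadowColor = UIColor.black
shadow.shadowBlurRadius = 4

// A negative strokeWidth both strokes the outline and fills the glyphs,
// so white text with a black outline stays readable on light or dark photos.
let attrs: [NSAttributedString.Key: Any] = [
    .font: UIFont.systemFont(ofSize: 64),
    .paragraphStyle: paragraphStyle,
    .foregroundColor: UIColor.white,
    .strokeColor: UIColor.black,
    .strokeWidth: -2,
    .shadow: shadow
]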
You can find my version of these projects at the end of day 89 on GitHub here.