Single Shot Multibox Detection
In :numref:sec_bbox–:numref:sec_object-detection-dataset, we introduced bounding boxes, anchor boxes, multiscale object detection, and the dataset for object detection. Now we are ready to use this background knowledge to design an object detection model: single shot multibox detection (SSD) [203]. This model is simple, fast, and widely used. Although it is just one of a vast number of object detection models, some of the design principles and implementation details in this section are also applicable to other models.
Model
The figure provides an overview of the design of single-shot multibox detection. This model mainly consists of a base network followed by several multiscale feature map blocks. The base network is for extracting features from the input image, so it can use a deep CNN. For example, the original single-shot multibox detection paper adopts a VGG network truncated before the classification layer [203], while ResNet has also been commonly used. Through our design we can make the base network output larger feature maps so as to generate more anchor boxes for detecting smaller objects. Subsequently, each multiscale feature map block reduces (e.g., halves) the height and width of the feature maps from the previous block, which enables each unit of the feature maps to increase its receptive field on the input image.
Recall the design of multiscale object detection through layerwise representations of images by deep neural networks in :numref:sec_multiscale-object-detection. Since multiscale feature maps closer to the top of the figure are smaller but have larger receptive fields, they are suitable for detecting fewer but larger objects.
In a nutshell, via its base network and several multiscale feature map blocks, single-shot multibox detection generates a varying number of anchor boxes with different sizes, and detects varying-size objects by predicting classes and offsets of these anchor boxes (thus the bounding boxes); thus, this is a multiscale object detection model.
In the following, we will describe the implementation details of the different blocks in the figure. To begin with, we discuss how to implement class and bounding box prediction.
Class Prediction Layer
Let the number of object classes be $q$. Then anchor boxes have $q+1$ classes, where class 0 is the background. At some scale, suppose that the height and width of the feature maps are $h$ and $w$, respectively. When $a$ anchor boxes are generated with each spatial position of these feature maps as their center, a total of $hwa$ anchor boxes need to be classified. This often makes classification with fully connected layers infeasible due to heavy parametrization costs. Recall how we used channels of convolutional layers to predict classes in :numref:sec_nin. Single-shot multibox detection uses the same technique to reduce model complexity.
Specifically, the class prediction layer uses a convolutional layer without altering the width or height of the feature maps. In this way, there can be a one-to-one correspondence between outputs and inputs at the same spatial dimensions (width and height) of the feature maps. More concretely, channels of the output feature maps at any spatial position $(x, y)$ represent class predictions for all the anchor boxes centered on $(x, y)$ of the input feature maps. Hence the number of output channels is $a(q+1)$, where for the same spatial position the output channel with index $i(q+1)+j$ ($0 \le j \le q$) represents the prediction of class $j$ for anchor box $i$ ($0 \le i < a$).
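To make this channel layout concrete, here is a quick sanity check with hypothetical values of $a$ and $q$ (the numbers are illustrative only):
a, q = 5, 10 # anchors per spatial position and number of object classes
out_channels = a * (q + 1) # 55 output channels in total
# 0-based channel index carrying the prediction of class j for anchor box i
channel(i, j) = i * (q + 1) + j
channel(2, 0) # 22: background score of the third anchor box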
Below we define such a class prediction layer, specifying $a$ and $q$ via the arguments num_anchors and num_classes, respectively. This layer uses a $3\times 3$ convolutional layer with a padding of 1, so the width and height of its input and output remain unchanged.
using Pkg;
Pkg.activate("../../d2lai")
using d2lai, Flux, CUDA, cuDNN, Statistics, Flux.Zygote, Plots
abstract type AbstractSSDBlock <: d2lai.AbstractModel end
struct ClassPredictor{N} <: AbstractSSDBlock
net::N
end
Flux.@layer ClassPredictor
(c::ClassPredictor)(x) = c.net(x)
function ClassPredictor(num_inputs::Int64, num_anchors::Int64, num_classes::Int64)
net = Conv(
(3,3),
num_inputs => num_anchors * (num_classes + 1),
pad = 1
) |> f64
ClassPredictor(net)
end

Activating project at `~/d2l-julia/d2lai`
ClassPredictor

Bounding Box Prediction Layer
The design of the bounding box prediction layer is similar to that of the class prediction layer. The only difference lies in the number of outputs for each anchor box: here we need to predict four offsets rather than $q+1$ classes.
struct BboxPredictor{N} <: AbstractSSDBlock
net::N
end
Flux.@layer BboxPredictor
function BboxPredictor(num_inputs::Int64, num_anchors::Int64)
net = Conv(
(3,3),
num_inputs => num_anchors * 4,
pad = 1
) |> f64
BboxPredictor(net)
end
(c::BboxPredictor)(x) = c.net(x)

Concatenating Predictions for Multiple Scales
As we mentioned, single-shot multibox detection uses multiscale feature maps to generate anchor boxes and predict their classes and offsets. At different scales, the shapes of feature maps or the numbers of anchor boxes centered on the same unit may vary. Therefore, shapes of the prediction outputs at different scales may vary.
In the following example, we construct feature maps at two different scales, Y1 and Y2, for the same minibatch, where the height and width of Y2 are half of those of Y1. Let's take class prediction as an example. Suppose that 5 and 3 anchor boxes are generated for every unit in Y1 and Y2, respectively. Suppose further that the number of object classes is 10. Then the numbers of channels in the class prediction outputs for Y1 and Y2 are $5\times(10+1)=55$ and $3\times(10+1)=33$, respectively.
function d2lai.forward(x, block::AbstractSSDBlock)
block(x)
end
Y1 = forward(rand(20, 20, 8, 2), ClassPredictor(8, 5, 10))
Y2 = forward(rand(10, 10, 16, 2), ClassPredictor(16, 3, 10))
@info size(Y1)
@info size(Y2)

[Info] (20, 20, 55, 2)
[Info] (10, 10, 33, 2)

As we can see, except for the batch size dimension, the other three dimensions all have different sizes. To concatenate these two prediction outputs for more efficient computation, we will transform these tensors into a more consistent format.
Note that the channel dimension holds the predictions for anchor boxes with the same center. We first move this dimension to the fastest-varying position. Since the batch size remains the same for different scales, we can then transform each prediction output into a three-dimensional tensor whose last dimension is the batch size, so that the outputs at different scales can be concatenated along the second dimension.
function concat_pred(preds, classes)
flatten_preds = map(preds) do pred
# put channels first (fastest-varying in Julia), then flatten positions and anchors
reshape(permutedims(pred, (3, 2, 1, 4)), classes, :, size(pred, 4))
end
reduce((a, b) -> cat(a, b; dims = 2), flatten_preds)
end

concat_pred (generic function with 1 method)

In this way, even though Y1 and Y2 have different sizes in channels, heights, and widths, we can still concatenate these two prediction outputs at two different scales for the same minibatch.
concat_pred([Y1, Y2], 10) |> size

(10, 2530, 2)

Downsampling Block
In order to detect objects at multiple scales, we define the following downsampling block DownSampleBlk that halves the height and width of input feature maps. In fact, this block applies the design of VGG blocks in :numref:subsec_vgg-blocks. More concretely, each downsampling block consists of two $3\times 3$ convolutional layers with a padding of 1 followed by a $2\times 2$ max-pooling layer with a stride of 2. The $3\times 3$ convolutional layers with a padding of 1 do not change the shape of the feature maps, while the $2\times 2$ max-pooling halves their height and width. Since $1\times 2+(3-1)+(3-1)=6$, each unit in the output feature maps of this downsampling block has a $6\times 6$ receptive field on its input, so the block enlarges the receptive field of each unit.
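As a side check, this receptive-field arithmetic can be reproduced with a short sketch (the kernel and stride list below is read off the block definition that follows):
# receptive field after two 3x3 convs (stride 1) and one 2x2 max-pool (stride 2)
rf, jump = 1, 1
for (k, s) in ((3, 1), (3, 1), (2, 2))
global rf += (k - 1) * jump # each layer widens the field by (k - 1) * jump
global jump *= s # the stride increases the step between adjacent output units
end
rf # 6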
struct DownSampleBlk{N} <: AbstractSSDBlock
net::N
end
Flux.@layer DownSampleBlk
(d::DownSampleBlk)(x) = d.net(x)
function DownSampleBlk(in_channels, out_channels)
blk = []
for _ in 1:2
push!(blk, Conv((3,3), in_channels => out_channels, pad = 1))
push!(blk, BatchNorm(out_channels))
push!(blk, relu)
in_channels = out_channels
end
net = Chain(blk..., MaxPool((2,2))) |> f64
return DownSampleBlk(net)
end
forward(zeros(20, 20, 3, 2), DownSampleBlk(3, 10)) |> size

(10, 10, 10, 2)

Base Network Block
The base network block is used to extract features from input images. For simplicity, we construct a small base network consisting of three downsampling blocks that double the number of channels at each block. Given a $256\times 256$ input image, this base network block outputs $32\times 32$ feature maps ($256/2^3 = 32$).
struct BaseNet{N} <: AbstractSSDBlock
net::N
end
Flux.@layer BaseNet
(b::BaseNet)(x) = b.net(x)
function BaseNet()
blks = []
num_filters = [3, 16, 32, 64]
for i in 1:length(num_filters)-1
blk = DownSampleBlk(num_filters[i], num_filters[i+1])
push!(blks, blk)
end
return BaseNet(Chain(blks...)) |> f64
end
@info size(BaseNet()(rand(256, 256, 3, 2)))

[Info] (32, 32, 64, 2)

The Complete Model
The complete single-shot multibox detection model consists of five blocks. The feature maps produced by each block are used for both (i) generating anchor boxes and (ii) predicting classes and offsets of these anchor boxes. Among these five blocks, the first one is the base network block, the second to the fourth are downsampling blocks, and the last block uses global max-pooling to reduce both the height and width to 1. Technically, the second to the fifth blocks are all multiscale feature map blocks in the figure.
function get_blk(i)
if i == 1
BaseNet()
elseif i == 2
DownSampleBlk(64, 128)
elseif i == 5
GlobalMaxPool()
else
DownSampleBlk(128, 128)
end
end

get_blk (generic function with 1 method)

Now we define the forward propagation for each block. Different from image classification tasks, outputs here include (i) CNN feature maps Y, (ii) anchor boxes generated using Y at the current scale, and (iii) classes and offsets predicted (based on Y) for these anchor boxes.
function blk_forward(X, blk, sz, ratio, cls_predictor, bbox_predictor)
Y = blk(X)
anchors = multibox_prior(Y, sz, ratio)
cls_preds = cls_predictor(Y)
bbox_preds = bbox_predictor(Y)
return Y, anchors, cls_preds, bbox_preds
end

blk_forward (generic function with 1 method)

Recall that in the figure a multiscale feature map block that is closer to the top is for detecting larger objects; thus, it needs to generate larger anchor boxes. In the above forward propagation, at each multiscale feature map block we pass in a list of two scale values via the sizes argument of the invoked multibox_prior function (described in :numref:sec_anchor). In the following, the interval between 0.2 and 1.05 is split evenly into five sections to determine the smaller scale values at the five blocks: 0.2, 0.37, 0.54, 0.71, and 0.88. Their larger scale values are then given by $\sqrt{0.2\times 0.37}=0.272$, $\sqrt{0.37\times 0.54}=0.447$, and so on.
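The following short check (a sketch of the arithmetic only, not part of the model) reproduces these scale values by taking each larger value as the geometric mean of a smaller value and the next one, with 1.05 closing the last interval:
smaller = [0.2 + 0.17 * (k - 1) for k in 1:5] # 0.2, 0.37, 0.54, 0.71, 0.88
larger = [sqrt(smaller[k] * (k < 5 ? smaller[k + 1] : 1.05)) for k in 1:5]
round.(larger; digits = 3) # 0.272, 0.447, 0.619, 0.79, 0.961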
sizes = [[0.2, 0.272], [0.37, 0.447], [0.54, 0.619], [0.71, 0.79],
[0.88, 0.961]]
ratios = repeat([[1, 2, 0.5]], 5)
num_anchors = length(sizes[1]) + length(ratios[1]) - 1

4

Now we can define the complete model TinySSD as follows.
struct TinySSD{B, CP, BP, A} <: AbstractClassifier
blocks::B
class_predictors::CP
bbox_predictors::BP
args::A
end
Flux.@layer TinySSD trainable=(blocks, class_predictors, bbox_predictors)
function TinySSD(num_classes; kw...)
idx_to_in_channels = [64, 128, 128, 128, 128]
blocks = map(1:5) do i
get_blk(i)
end
class_predictors = map(1:5) do i
ClassPredictor(idx_to_in_channels[i], num_anchors, num_classes)
end
bbox_predictors = map(1:5) do i
BboxPredictor(idx_to_in_channels[i], num_anchors)
end
TinySSD(blocks, class_predictors, bbox_predictors, (; num_classes, kw...)) |> f64
end
function (model::TinySSD)(x::AbstractArray)
blocks, class_predictors, bbox_predictors = model.blocks, model.class_predictors, model.bbox_predictors
sizes, ratios = model.args.sizes, model.args.ratios
anchors = AbstractArray{<:Real}[]
cls_preds = AbstractArray{<:Real}[]
bbox_preds = AbstractArray{<:Real}[]
batch_size = size(x)[end]
for i in 1:length(blocks)
blk, class_predictor, bbox_predictor, sz, rt = blocks[i], class_predictors[i], bbox_predictors[i], sizes[i], ratios[i]
x, anchors_i, cls_preds_i, bbox_preds_i = blk_forward(x, blk, sz, rt, class_predictor, bbox_predictor)
anchors = [anchors; [anchors_i]]
cls_preds = [cls_preds; [reshape(permutedims(cls_preds_i, (3, 2, 1, 4)), model.args.num_classes+1, :, batch_size)]]
bbox_preds = [bbox_preds; [reshape(permutedims(bbox_preds_i, (3, 2, 1, 4)), :, batch_size)]]
end
anchors = reduce(vcat, anchors)
cls_preds = reduce((a, b) -> cat(a, b; dims = 2), cls_preds)
bbox_preds = reduce(vcat, bbox_preds)
anchors, cls_preds, bbox_preds
end

We create a model instance and use it to perform forward propagation on a minibatch of X.
As shown earlier in this section, the first block outputs $32\times 32$ feature maps. Recall that the second to fourth downsampling blocks halve the height and width, and the fifth block uses global max-pooling. Since 4 anchor boxes are generated for each unit along the spatial dimensions of the feature maps, a total of $(32^2+16^2+8^2+4^2+1)\times 4=5444$ anchor boxes are generated per image across all five scales.
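This count can be verified with a quick computation (the side lengths are taken from the feature-map shapes shown above):
sides = [32, 16, 8, 4, 1] # feature-map side lengths at the five scales
sum(s^2 for s in sides) * num_anchors # 1361 * 4 = 5444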
model = TinySSD(1; sizes, ratios) |> f64 |> gpu
X = zeros(256, 256, 3, 1) |> gpu
anchors, cls_preds, bbox_preds = model(X)
println("output anchors:", size(anchors))
println("output class preds:", size(cls_preds))
print("output bbox preds:", size(bbox_preds))output anchors:(5444, 4, 1)
output class preds:(2, 5444, 1)
output bbox preds:(21776, 1)Training
Now we will explain how to train the single shot multibox detection model for object detection.
Reading the Dataset and Initializing the Model
To begin with, let's read the banana detection dataset described in :numref:sec_object-detection-dataset.
data = d2lai.BananaDataset(; batchsize = 32)

Data object of type d2lai.BananaDataset

After defining the model, we need to define the optimization algorithm. We will be using our Trainer interface with the Adam optimizer.
trainer = Trainer(model, data, Adam(0.1); max_epochs = 20, board_yscale = :linear)

Defining Loss and Evaluation Functions
Object detection has two types of losses. The first loss concerns classes of anchor boxes: its computation can simply reuse the cross-entropy loss function that we used for image classification. The second loss concerns offsets of positive (non-background) anchor boxes: this is a regression problem. For this regression problem, however, here we do not use the squared loss described in :numref:subsec_normal_distribution_and_squared_loss. Instead, we use the $\ell_1$ norm loss, the absolute values of the difference between the predicted values and the ground truth. The mask variable bbox_masks filters out negative anchor boxes and illegal (padded) anchor boxes in the loss calculation. In the end, we sum up the anchor box class loss and the anchor box offset loss to obtain the loss function for the model.
function cls_loss(model::TinySSD, y::AbstractArray, y_pred::AbstractArray)
# per-element cross-entropy over one-hot class labels (class 0 is background)
Flux.Losses.logitcrossentropy(y_pred, Flux.onehotbatch(y, 0:model.args.num_classes); agg = identity)
end
function bbox_loss(args...)
# per-element L1 loss for the offsets
Flux.Losses.mae(args...; agg = identity)
end
function calc_loss(model::TinySSD, cls_preds, cls_labels, bbox_preds, bbox_labels, bbox_masks)
batch_size = size(cls_preds, 3)
num_anchors = size(cls_preds, 2)
num_classes = size(cls_preds, 1)
# flatten to (num_classes, anchors * batch) for the cross-entropy loss
cls_preds_flat = reshape(cls_preds, num_classes, :)
cls_labels_flat = vec(cls_labels)
cls_loss_flat = cls_loss(model, Int.(cls_labels_flat), cls_preds_flat)
cls_loss_per_image = reshape(cls_loss_flat, num_anchors, batch_size)
cls_loss_mean = mean(cls_loss_per_image; dims = 1) # shape: (1, batch_size)
# offset loss: the masks zero out negative and padded anchor boxes
bbox_loss_raw = bbox_loss(bbox_preds .* bbox_masks, bbox_labels .* bbox_masks)
bbox_loss_per_image = mean(bbox_loss_raw; dims = 1) # shape: (1, batch_size)
total_loss_per_image = cls_loss_mean + bbox_loss_per_image
mean(total_loss_per_image)
end

calc_loss (generic function with 1 method)

We can use accuracy to evaluate the classification results. Due to the $\ell_1$ norm loss used for the offsets, we use the mean absolute error to evaluate the predicted bounding boxes.
function cls_eval(cls_preds, cls_labels)
arg_ = getindex.(argmax(cls_preds; dims = 1), 1)
sum(dropdims(arg_, dims = 1) .- 1 .== cls_labels) / length(cls_labels)
end
function bbox_eval(bbox_preds, bbox_labels, bbox_masks)
sum(abs.((bbox_labels - bbox_preds) .* bbox_masks)) / length(bbox_labels)
end

bbox_eval (generic function with 1 method)

Training the Model
When training the model, we need to generate multiscale anchor boxes (anchors) and predict their classes (cls_preds) and offsets (bbox_preds) in the forward propagation. Then we label the classes (cls_labels) and offsets (bbox_labels) of such generated anchor boxes based on the label information Y. Finally, we calculate the loss function using the predicted and labeled values of the classes and offsets. For a concise implementation, evaluation on the test dataset is omitted here.
function train_model(trainer::Trainer)
train_iter = d2lai.get_dataloader(trainer.data) |> gpu
model = trainer.model
state = Flux.setup(trainer.opt, model)
for i in 1:trainer.args.max_epochs
losses = (loss = [], class_error = [], bbox_error = [])
for d_ in train_iter
current_loss, gs = Zygote.withgradient(model) do m
anchors_, cls_preds, bbox_preds = m(d_[1]);
bbox_labels, bbox_masks, cls_labels = Zygote.@ignore multibox_target(anchors_, d_[2], gpu; iou_threshold = 0.5)
loss = calc_loss(m, cls_preds, cls_labels, bbox_preds, bbox_labels, bbox_masks)
end
Flux.Optimise.update!(state, model, gs[1])
push!(losses.loss, current_loss)
anchors_, cls_preds, bbox_preds = model(d_[1]);
bbox_labels, bbox_masks, cls_labels = multibox_target(anchors_, d_[2], gpu; iou_threshold = 0.5)
class_error = cls_eval(cls_preds, cls_labels)
bbox_error = bbox_eval(bbox_preds, bbox_labels, bbox_masks)
push!(losses.class_error, 1 - class_error)
push!(losses.bbox_error, bbox_error)
end
metrics = (cls_error = losses.class_error, bbox_error = losses.bbox_error)
d2lai.draw_metrics(model, i, trainer, metrics)
@info "Epoch : $i, Loss: $(mean(losses.loss)), Class Err: $(mean(losses.class_error)), BBOX Err: $(mean(losses.bbox_error))"
end
Plots.display(trainer.board.plt)
model
end

train_model (generic function with 1 method)

model = train_model(trainer);

[Info] Epoch : 1, Loss: 0.14224652125671278, Class Err: 0.004857519459496706, BBOX Err: 161.49789341041767
[Info] Epoch : 2, Loss: 0.029414293285683277, Class Err: 0.004842451265154302, BBOX Err: 0.005319348648518894
[Info] Epoch : 3, Loss: 0.02514334365084119, Class Err: 0.004855905010102874, BBOX Err: 0.003800824125404921
[Info] Epoch : 4, Loss: 0.02409084043051493, Class Err: 0.00484729461333579, BBOX Err: 0.0034777254928469604
[Info] Epoch : 5, Loss: 0.021424882059647453, Class Err: 0.004888732147777377, BBOX Err: 0.0034864238285626206
[Info] Epoch : 6, Loss: 0.021341063367222822, Class Err: 0.004861824657880241, BBOX Err: 0.004651389364080739
[Info] Epoch : 7, Loss: 0.019532552771021165, Class Err: 0.004867744305657611, BBOX Err: 0.003428335620717753
[Info] Epoch : 8, Loss: 0.01842878948550254, Class Err: 0.004842451265154309, BBOX Err: 0.0033611699389536967
[Info] Epoch : 9, Loss: 0.020131704833401057, Class Err: 0.004910975672759006, BBOX Err: 0.0036245290354331066
[Info] Epoch : 10, Loss: 0.01935703480412732, Class Err: 0.005834799492560626, BBOX Err: 0.003743365802154613
[Info] Epoch : 11, Loss: 0.01801922697002491, Class Err: 0.00548625780675974, BBOX Err: 0.003591899188606131
[Info] Epoch : 12, Loss: 0.017102654301792565, Class Err: 0.004726928441862593, BBOX Err: 0.0033941975845013465
[Info] Epoch : 13, Loss: 0.016329028116634652, Class Err: 0.004443861648144729, BBOX Err: 0.0032768133618152743
[Info] Epoch : 14, Loss: 0.016027264431979063, Class Err: 0.0046200160153379854, BBOX Err: 0.0036231853862204485
[Info] Epoch : 15, Loss: 0.01528508755584546, Class Err: 0.00447543310295738, BBOX Err: 0.0032866238506892693
[Info] Epoch : 16, Loss: 0.01515382626419919, Class Err: 0.004380001205455539, BBOX Err: 0.003294852439656169
[Info] Epoch : 17, Loss: 0.01506004249849537, Class Err: 0.004412110810066115, BBOX Err: 0.003404000311330696
[Info] Epoch : 18, Loss: 0.015148414162186935, Class Err: 0.004606921036921369, BBOX Err: 0.003321402526378537
[Info] Epoch : 19, Loss: 0.015330778208342381, Class Err: 0.008824939153196178, BBOX Err: 0.00371015740626703
[Info] Epoch : 20, Loss: 0.015630712855222748, Class Err: 0.00426376084909991, BBOX Err: 0.003216079026663463

Prediction
During prediction, the goal is to detect all the objects of interest in the image. Below we read and resize a test image, converting it to the four-dimensional tensor required by convolutional layers.
using Images, DataAugmentation
img = load("../img/banana.jpg")
img_ = Image(img)
img_tensor = apply(ImageToTensor(), img_) |> itemdata
img_tensor = permutedims(img_tensor, (2,1,3))
X = Flux.unsqueeze(img_tensor, dims = 4) |> gpu;

Using the multibox_detection function below, the predicted bounding boxes are obtained from the anchor boxes and their predicted offsets. Then non-maximum suppression is used to remove similar predicted bounding boxes.
function predict(model::TinySSD, X)
Flux.testmode!(model) # equivalent to net.eval()
anchors, cls_preds, bbox_preds = model(X) # assume batch-last layout
cls_probs = softmax(cls_preds; dims=1) # softmax over class dimension
output = d2lai.multibox_detection(cls_probs, bbox_preds, anchors) |> cpu
# Output is a Vector of Matrices (one per image)
preds = output[1] # predictions for first image
# Keep only non-background predictions (class_id != -1)
valid_idx = findall(preds[:, 1] .!= -1)
return preds[valid_idx, :]
end
output = predict(model, X);

Finally, we display all the predicted bounding boxes with confidence 0.1 or above as output.
function display(img, output, threshold)
plt = plot(img)
for r in eachrow(output)
score = r[2]
if score > threshold
# bounding boxes are normalized; rescale to the 256x256 input size
bbox = reshape(r[3:6] .* 256, 1, 4)
plt = d2lai.show_bboxes(plt, bbox; colors = [:white])
end
end
plt
end
display(img, output, 0.1)

Summary
Single shot multibox detection is a multiscale object detection model. Via its base network and several multiscale feature map blocks, single-shot multibox detection generates a varying number of anchor boxes with different sizes, and detects varying-size objects by predicting classes and offsets of these anchor boxes (thus the bounding boxes).
When training the single-shot multibox detection model, the loss function is calculated based on the predicted and labeled values of the anchor box classes and offsets.
Exercises
- Can you improve single-shot multibox detection by improving the loss function? For example, replace the $\ell_1$ norm loss with the smooth $\ell_1$ norm loss for the predicted offsets. This loss function uses a square function around zero for smoothness, which is controlled by the hyperparameter $\sigma$:

$$f(x) = \begin{cases} (\sigma x)^2 / 2, & \text{if } |x| < 1/\sigma^2 \\ |x| - \frac{1}{2\sigma^2}, & \text{otherwise} \end{cases}$$

When $\sigma$ is very large, this loss is similar to the $\ell_1$ norm loss. When its value is smaller, the loss function is smoother.
function smooth_l1(x, scalar)
idx = abs.(x) .< (1 / (scalar ^ 2))
y = zeros(size(x))
y .= abs.(x) .- 0.5 ./ (scalar ^ 2)
x2 = (scalar .* x).^2 ./ 2.
y[idx] .= x2[idx]
y
end
sigmas = [10, 1, 0.5]
x = collect(-2:0.1:2)
plt = plot()
for sigma in sigmas
y = smooth_l1(x, sigma)
plot!(plt, x, y; label = "sigma $sigma")
end
plt

[Plot: smooth $\ell_1$ loss curves for $\sigma$ = 10, 1, and 0.5 over $x \in [-2, 2]$; smaller $\sigma$ gives a smoother curve around zero.]
/><path clip-path="url(#clip490)" d="M2124.85 1339.04 Q2119.69 1339.04 2117.7 1340.22 Q2115.71 1341.4 2115.71 1344.25 Q2115.71 1346.52 2117.19 1347.86 Q2118.69 1349.18 2121.26 1349.18 Q2124.8 1349.18 2126.93 1346.68 Q2129.09 1344.16 2129.09 1339.99 L2129.09 1339.04 L2124.85 1339.04 M2133.35 1337.28 L2133.35 1352.07 L2129.09 1352.07 L2129.09 1348.14 Q2127.63 1350.5 2125.45 1351.63 Q2123.28 1352.75 2120.13 1352.75 Q2116.15 1352.75 2113.79 1350.52 Q2111.45 1348.28 2111.45 1344.53 Q2111.45 1340.15 2114.36 1337.93 Q2117.3 1335.71 2123.11 1335.71 L2129.09 1335.71 L2129.09 1335.29 Q2129.09 1332.35 2127.14 1330.76 Q2125.22 1329.13 2121.73 1329.13 Q2119.5 1329.13 2117.4 1329.67 Q2115.29 1330.2 2113.35 1331.26 L2113.35 1327.33 Q2115.68 1326.43 2117.88 1325.99 Q2120.08 1325.52 2122.17 1325.52 Q2127.79 1325.52 2130.57 1328.44 Q2133.35 1331.36 2133.35 1337.28 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2158.6 1348.14 L2166.24 1348.14 L2166.24 1321.77 L2157.93 1323.44 L2157.93 1319.18 L2166.19 1317.51 L2170.87 1317.51 L2170.87 1348.14 L2178.51 1348.14 L2178.51 1352.07 L2158.6 1352.07 L2158.6 1348.14 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2187.95 1346.19 L2192.84 1346.19 L2192.84 1352.07 L2187.95 1352.07 L2187.95 1346.19 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2213.02 1320.59 Q2209.41 1320.59 2207.58 1324.16 Q2205.78 1327.7 2205.78 1334.83 Q2205.78 1341.94 2207.58 1345.5 Q2209.41 1349.04 2213.02 1349.04 Q2216.66 1349.04 2218.46 1345.5 Q2220.29 1341.94 2220.29 1334.83 Q2220.29 1327.7 2218.46 1324.16 Q2216.66 1320.59 2213.02 1320.59 M2213.02 1316.89 Q2218.83 1316.89 2221.89 1321.5 Q2224.97 1326.08 2224.97 1334.83 Q2224.97 1343.56 2221.89 1348.16 Q2218.83 1352.75 2213.02 1352.75 Q2207.21 1352.75 2204.13 1348.16 Q2201.08 1343.56 2201.08 1334.83 Q2201.08 1326.08 2204.13 1321.5 Q2207.21 1316.89 2213.02 1316.89 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><polyline clip-path="url(#clip490)" style="stroke:#3da44d; stroke-linecap:round; stroke-linejoin:round; stroke-width:4; stroke-opacity:1; fill:none" stroke-dasharray="2, 4" points="1788.86,1386.63 1935.31,1386.63 "/>
<path clip-path="url(#clip490)" d="M1978.62 1390.65 Q1978.62 1386.02 1976.7 1383.47 Q1974.8 1380.93 1971.36 1380.93 Q1967.93 1380.93 1966.01 1383.47 Q1964.11 1386.02 1964.11 1390.65 Q1964.11 1395.26 1966.01 1397.8 Q1967.93 1400.35 1971.36 1400.35 Q1974.8 1400.35 1976.7 1397.8 Q1978.62 1395.26 1978.62 1390.65 M1982.88 1400.7 Q1982.88 1407.32 1979.94 1410.53 Q1977 1413.78 1970.94 1413.78 Q1968.69 1413.78 1966.7 1413.43 Q1964.71 1413.1 1962.84 1412.41 L1962.84 1408.27 Q1964.71 1409.28 1966.54 1409.77 Q1968.37 1410.26 1970.27 1410.26 Q1974.46 1410.26 1976.54 1408.06 Q1978.62 1405.88 1978.62 1401.46 L1978.62 1399.35 Q1977.3 1401.65 1975.24 1402.78 Q1973.18 1403.91 1970.31 1403.91 Q1965.55 1403.91 1962.63 1400.28 Q1959.71 1396.65 1959.71 1390.65 Q1959.71 1384.63 1962.63 1381 Q1965.55 1377.36 1970.31 1377.36 Q1973.18 1377.36 1975.24 1378.5 Q1977.3 1379.63 1978.62 1381.92 L1978.62 1377.99 L1982.88 1377.99 L1982.88 1400.7 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2003.44 1390.88 Q1998.28 1390.88 1996.29 1392.06 Q1994.3 1393.24 1994.3 1396.09 Q1994.3 1398.36 1995.78 1399.7 Q1997.28 1401.02 1999.85 1401.02 Q2003.39 1401.02 2005.52 1398.52 Q2007.67 1396 2007.67 1391.83 L2007.67 1390.88 L2003.44 1390.88 M2011.93 1389.12 L2011.93 1403.91 L2007.67 1403.91 L2007.67 1399.98 Q2006.22 1402.34 2004.04 1403.47 Q2001.86 1404.59 1998.72 1404.59 Q1994.74 1404.59 1992.37 1402.36 Q1990.04 1400.12 1990.04 1396.37 Q1990.04 1391.99 1992.95 1389.77 Q1995.89 1387.55 2001.7 1387.55 L2007.67 1387.55 L2007.67 1387.13 Q2007.67 1384.19 2005.73 1382.6 Q2003.81 1380.97 2000.31 1380.97 Q1998.09 1380.97 1995.99 1381.51 Q1993.88 1382.04 1991.93 1383.1 L1991.93 1379.17 Q1994.27 1378.27 1996.47 1377.83 Q1998.67 1377.36 2000.75 1377.36 Q2006.38 1377.36 2009.16 1380.28 Q2011.93 1383.2 2011.93 1389.12 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2040.89 1382.97 Q2042.49 1380.1 2044.71 1378.73 Q2046.93 1377.36 2049.94 1377.36 Q2053.99 1377.36 2056.19 1380.21 Q2058.39 1383.03 2058.39 1388.27 L2058.39 1403.91 L2054.11 1403.91 L2054.11 1388.41 Q2054.11 1384.68 2052.79 1382.87 Q2051.47 1381.07 2048.76 1381.07 Q2045.45 1381.07 2043.53 1383.27 Q2041.61 1385.47 2041.61 1389.26 L2041.61 1403.91 L2037.33 1403.91 L2037.33 1388.41 Q2037.33 1384.66 2036.01 1382.87 Q2034.69 1381.07 2031.93 1381.07 Q2028.67 1381.07 2026.75 1383.29 Q2024.83 1385.49 2024.83 1389.26 L2024.83 1403.91 L2020.55 1403.91 L2020.55 1377.99 L2024.83 1377.99 L2024.83 1382.02 Q2026.29 1379.63 2028.32 1378.5 Q2030.36 1377.36 2033.16 1377.36 Q2035.98 1377.36 2037.95 1378.8 Q2039.94 1380.23 2040.89 1382.97 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2087.07 1382.97 Q2088.67 1380.1 2090.89 1378.73 Q2093.11 1377.36 2096.12 1377.36 Q2100.17 1377.36 2102.37 1380.21 Q2104.57 1383.03 2104.57 1388.27 L2104.57 1403.91 L2100.29 1403.91 L2100.29 1388.41 Q2100.29 1384.68 2098.97 1382.87 Q2097.65 1381.07 2094.94 1381.07 Q2091.63 1381.07 2089.71 1383.27 Q2087.79 1385.47 2087.79 1389.26 L2087.79 1403.91 L2083.51 1403.91 L2083.51 1388.41 Q2083.51 1384.66 2082.19 1382.87 Q2080.87 1381.07 2078.11 1381.07 Q2074.85 1381.07 2072.93 1383.29 Q2071.01 1385.49 2071.01 1389.26 L2071.01 1403.91 L2066.73 1403.91 L2066.73 1377.99 L2071.01 1377.99 L2071.01 1382.02 Q2072.47 1379.63 2074.5 1378.5 Q2076.54 1377.36 2079.34 1377.36 Q2082.17 1377.36 2084.13 1378.8 Q2086.12 1380.23 2087.07 1382.97 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path 
clip-path="url(#clip490)" d="M2124.85 1390.88 Q2119.69 1390.88 2117.7 1392.06 Q2115.71 1393.24 2115.71 1396.09 Q2115.71 1398.36 2117.19 1399.7 Q2118.69 1401.02 2121.26 1401.02 Q2124.8 1401.02 2126.93 1398.52 Q2129.09 1396 2129.09 1391.83 L2129.09 1390.88 L2124.85 1390.88 M2133.35 1389.12 L2133.35 1403.91 L2129.09 1403.91 L2129.09 1399.98 Q2127.63 1402.34 2125.45 1403.47 Q2123.28 1404.59 2120.13 1404.59 Q2116.15 1404.59 2113.79 1402.36 Q2111.45 1400.12 2111.45 1396.37 Q2111.45 1391.99 2114.36 1389.77 Q2117.3 1387.55 2123.11 1387.55 L2129.09 1387.55 L2129.09 1387.13 Q2129.09 1384.19 2127.14 1382.6 Q2125.22 1380.97 2121.73 1380.97 Q2119.5 1380.97 2117.4 1381.51 Q2115.29 1382.04 2113.35 1383.1 L2113.35 1379.17 Q2115.68 1378.27 2117.88 1377.83 Q2120.08 1377.36 2122.17 1377.36 Q2127.79 1377.36 2130.57 1380.28 Q2133.35 1383.2 2133.35 1389.12 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2167.79 1372.43 Q2164.18 1372.43 2162.35 1376 Q2160.54 1379.54 2160.54 1386.67 Q2160.54 1393.78 2162.35 1397.34 Q2164.18 1400.88 2167.79 1400.88 Q2171.42 1400.88 2173.23 1397.34 Q2175.06 1393.78 2175.06 1386.67 Q2175.06 1379.54 2173.23 1376 Q2171.42 1372.43 2167.79 1372.43 M2167.79 1368.73 Q2173.6 1368.73 2176.66 1373.34 Q2179.73 1377.92 2179.73 1386.67 Q2179.73 1395.4 2176.66 1400 Q2173.6 1404.59 2167.79 1404.59 Q2161.98 1404.59 2158.9 1400 Q2155.85 1395.4 2155.85 1386.67 Q2155.85 1377.92 2158.9 1373.34 Q2161.98 1368.73 2167.79 1368.73 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2187.95 1398.03 L2192.84 1398.03 L2192.84 1403.91 L2187.95 1403.91 L2187.95 1398.03 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /><path clip-path="url(#clip490)" d="M2203.07 1369.35 L2221.42 1369.35 L2221.42 1373.29 L2207.35 1373.29 L2207.35 1381.76 Q2208.37 1381.41 2209.39 1381.25 Q2210.41 1381.07 2211.42 1381.07 Q2217.21 1381.07 2220.59 1384.24 Q2223.97 1387.41 2223.97 1392.83 Q2223.97 1398.41 2220.5 1401.51 Q2217.03 1404.59 2210.71 1404.59 Q2208.53 1404.59 2206.26 1404.22 Q2204.02 1403.84 2201.61 1403.1 L2201.61 1398.41 Q2203.69 1399.54 2205.91 1400.09 Q2208.14 1400.65 2210.61 1400.65 Q2214.62 1400.65 2216.96 1398.54 Q2219.29 1396.44 2219.29 1392.83 Q2219.29 1389.22 2216.96 1387.11 Q2214.62 1385 2210.61 1385 Q2208.74 1385 2206.86 1385.42 Q2205.01 1385.84 2203.07 1386.72 L2203.07 1369.35 Z" fill="#000000" fill-rule="nonzero" fill-opacity="1" /></svg>Besides, in the experiment we used cross-entropy loss for class prediction: denoting by
As we can see, increasing
# Focal loss with alpha = 1: -(1 - p)^gamma * log(p)
function focal_loss(gamma, x)
    return (-1 .* (1 .- x) .^ gamma) .* log.(x)
end

x = 0.01:0.01:1 |> collect
plt = plot()
for gamma in [0, 1, 5]
    y = focal_loss(gamma, x)
    plot!(plt, x, y; label = "gamma = $gamma")
end
plt

The resulting plot shows that every curve decreases monotonically in the predicted probability, and that a larger $\gamma$ drives the loss for well-classified examples (large $p_j$) toward zero much faster.

Due to space limitations, we have omitted some implementation details of the single shot multibox detection model in this section. Can you further improve the model in the following aspects:
When an object is much smaller than the image, the model could resize the input image to be bigger (the first sketch below shows one way to do this).
There are typically a vast number of negative anchor boxes. To make the class distribution more balanced, we could downsample negative anchor boxes (see the second sketch below).
In the loss function, assign different weight hyperparameters to the class loss and the offset loss (see the third sketch below).
Use other methods to evaluate the object detection model, such as those in the single shot multibox detection paper [203] (see the fourth sketch below).
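As a starting point for the first exercise, here is a minimal sketch of enlarging the input with bilinear upsampling before it enters the base network; `Upsample` is Flux's built-in layer, and the factor of 2 is an arbitrary choice for illustration.

using Flux

# Sketch: upsample the input so small objects cover more pixels (and hence
# more anchor centers) on the downstream feature maps.
upsizer = Upsample(:bilinear, scale = (2, 2))

X = rand(Float32, 256, 256, 3, 1)   # a WHCN input, as elsewhere in this chapter
X_big = upsizer(X)
size(X_big)                          # (512, 512, 3, 1)

Note that the multiscale feature maps become correspondingly larger, so the anchor boxes must be regenerated for the new feature-map sizes.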
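For the second exercise, one common recipe is to keep every positive anchor but only a few times as many randomly chosen negatives. The helper below is hypothetical (not part of d2lai); it assumes class label 0 marks background anchors and returns a Boolean mask that could be applied to the class loss.

using Random

# Hypothetical helper: keep all positive anchors and at most `ratio` times as
# many randomly sampled negative (background) anchors.
function sample_negatives(cls_labels::AbstractVector{<:Integer}; ratio = 3)
    pos = findall(!=(0), cls_labels)          # label 0 is background
    neg = findall(==(0), cls_labels)
    n_keep = min(length(neg), ratio * max(length(pos), 1))
    keep_neg = shuffle(neg)[1:n_keep]
    mask = falses(length(cls_labels))
    mask[pos] .= true
    mask[keep_neg] .= true
    return mask                               # use to mask the class loss
end

labels = [0, 0, 1, 0, 0, 0, 0, 2, 0, 0]
mask = sample_negatives(labels; ratio = 2)    # 2 positives + 4 sampled negatives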
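For the third exercise, here is a sketch of a reweighted objective, assuming that class prediction uses cross-entropy over logits with one-hot labels and that offsets use a masked L1 term; `cls_weight` and `bbox_weight` are hypothetical hyperparameters.

using Flux, Statistics

# Sketch: trade off classification against localization via two weights.
function weighted_loss(cls_preds, cls_labels, bbox_preds, bbox_labels, bbox_masks;
                       cls_weight = 1.0, bbox_weight = 1.0)
    cls = Flux.logitcrossentropy(cls_preds, cls_labels)            # class term
    bbox = mean(abs.((bbox_preds .- bbox_labels) .* bbox_masks))   # masked L1 term
    return cls_weight * cls + bbox_weight * bbox
end

Raising `bbox_weight` pushes training toward tighter boxes; raising `cls_weight` favors correct class predictions.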
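For the last exercise, the single shot multibox detection paper [203] evaluates with mean average precision (mAP), whose core building block is IoU-based matching of predictions to ground truth. Below is a minimal, hypothetical sketch of that matching and of precision at an IoU threshold; boxes are (xmin, ymin, xmax, ymax) tuples, and a full mAP would additionally rank predictions by confidence and average precision over recall levels and classes.

# Intersection over union of two boxes given as (xmin, ymin, xmax, ymax).
function box_iou(a, b)
    ix = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    iy = max(0.0, min(a[4], b[4]) - max(a[2], b[2]))
    inter = ix * iy
    area(t) = (t[3] - t[1]) * (t[4] - t[2])
    return inter / (area(a) + area(b) - inter)
end

# A prediction counts as a true positive if it matches a not-yet-used
# ground-truth box with IoU >= thresh; precision = TP / (TP + FP).
function precision_at_iou(preds, gts; thresh = 0.5)
    used = falses(length(gts))
    tp = 0
    for p in preds
        for i in eachindex(gts)
            if !used[i] && box_iou(p, gts[i]) >= thresh
                used[i] = true
                tp += 1
                break
            end
        end
    end
    return tp / max(length(preds), 1)
end

preds = [(0.10, 0.10, 0.50, 0.50)]
gts   = [(0.12, 0.08, 0.52, 0.48)]
precision_at_iou(preds, gts)        # 1.0 for this well-matched pair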