Upload Date | May 18 2024 09:22 AM |
Views | 3 |

System Information | |
---|---|
Operating System | iOS 17.5 |
Model | iPad (9th generation) |
Model ID | iPad12,1 |
Motherboard | J181AP |

CPU Information | |
---|---|
Name | Apple A13 Bionic |
Topology | 1 Processor, 6 Cores |
Identifier | ARM |
Base Frequency | 2.66 GHz |
Cluster 1 | 2 Cores |
Cluster 2 | 4 Cores |
L1 Instruction Cache | 96.0 KB x 1 |
L1 Data Cache | 48.0 KB x 1 |
L2 Cache | 4.00 MB x 1 |

Memory Information | |
---|---|
Size | 2.87 GB |

Inference Information | |
---|---|
Framework | Core ML |
Backend | Neural Engine |
Device | Default |
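The inference configuration above (Core ML with the Neural Engine backend) maps onto Core ML's compute-unit preference. As a rough illustration only, and not the benchmark's own code, the sketch below shows how an app would load a model with that preference; the model name "ImageClassifier" is a placeholder for any compiled .mlmodelc bundled with the app.

```swift
import Foundation
import CoreML

// Minimal sketch, assuming a compiled model named "ImageClassifier" in the app bundle.
// Not the benchmark's implementation; it only illustrates the backend selection above.
func loadModelPreferringNeuralEngine() throws -> MLModel {
    let config = MLModelConfiguration()
    // .all lets Core ML dispatch to CPU, GPU, or the Neural Engine as it sees fit;
    // on iOS 16+, .cpuAndNeuralEngine restricts work to the CPU and ANE only.
    config.computeUnits = .all

    guard let url = Bundle.main.url(forResource: "ImageClassifier",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    return try MLModel(contentsOf: url, configuration: config)
}
```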
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 278 | 52.0 IPS |
Image Classification (F16) | 100% | 2675 | 500.6 IPS |
Image Classification (I8) | 99% | 2637 | 493.5 IPS |
Image Segmentation (F32) | 100% | 219 | 3.66 IPS |
Image Segmentation (F16) | 100% | 331 | 5.53 IPS |
Image Segmentation (I8) | 100% | 331 | 5.53 IPS |
Pose Estimation (F32) | 100% | 2259 | 2.74 IPS |
Pose Estimation (F16) | 100% | 25695 | 31.1 IPS |
Pose Estimation (I8) | 100% | 25622 | 31.0 IPS |
Object Detection (F32) | 100% | 402 | 30.0 IPS |
Object Detection (F16) | 100% | 1095 | 81.8 IPS |
Object Detection (I8) | 97% | 497 | 37.1 IPS |
Face Detection (F32) | 100% | 2380 | 28.3 IPS |
Face Detection (F16) | 98% | 4890 | 58.1 IPS |
Face Detection (I8) | 97% | 5124 | 60.9 IPS |
Depth Estimation (F32) | 100% | 2472 | 19.2 IPS |
Depth Estimation (F16) | 100% | 20586 | 159.6 IPS |
Depth Estimation (I8) | 99% | 21597 | 167.5 IPS |
Style Transfer (F32) | 100% | 5887 | 7.74 IPS |
Style Transfer (F16) | 100% | 18071 | 23.8 IPS |
Style Transfer (I8) | 100% | 18298 | 24.1 IPS |
Image Super-Resolution (F32) | 100% | 1036 | 37.0 IPS |
Image Super-Resolution (F16) | 100% | 4803 | 171.5 IPS |
Image Super-Resolution (I8) | 100% | 5082 | 181.5 IPS |
Text Classification (F32) | 100% | 231 | 331.7 IPS |
Text Classification (F16) | 100% | 295 | 424.1 IPS |
Text Classification (I8) | 96% | 296 | 425.9 IPS |
Machine Translation (F32) | 100% | 208 | 3.82 IPS |
Machine Translation (F16) | 100% | 203 | 3.74 IPS |
Machine Translation (I8) | 99% | 218 | 4.01 IPS |