phanerozoic / cofiber-detection (huggingface.co)
https://huggingface.co/phanerozoic/cofiber-detection
Total runs: 576 (24-hour: 0, 7-day: 94, 30-day: 259)
Last updated: April 22, 2026
object-detection


Cofiber Detection

Object detection heads built on cofiber decomposition of frozen EUPE-ViT-B features. The cofiber decomposition produces multi-scale representations with zero learned parameters, replacing the 11M-parameter FPN typically used in FCOS-style detectors. Heads range from 70-parameter analytical constructions to 3.85M-parameter trained networks, evaluated on COCO val2017.

The Cofiber Decomposition

Given spatial backbone features f : [768, H, W], the cofiber decomposition produces n scale bands via iterated subtraction of downsampled-then-upsampled content:

residual = f
for k = 0 to n-2:
    omega_k = avgpool(residual, 2)
    sigma_omega_k = upsample_bilinear(omega_k, size=residual.shape)
    cofiber_k = residual - sigma_omega_k
    residual = omega_k
cofiber_{n-1} = residual

Each cofiber_k captures frequency content at a distinct scale with no cross-scale interference. The decomposition is a fixed, parameter-free operation a few lines long, yet it provides the same multi-scale structure that an FPN synthesizes with 11M trained parameters.
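A minimal numpy sketch of the decomposition and its exact inverse. Nearest-neighbour upsampling stands in here for the bilinear section used in the repository; the reconstruction identity holds for any linear upsample, since each band is defined as exactly what the pooled-then-upsampled content misses:

```python
import numpy as np

def avgpool2(x):
    # 2x2 average pooling (assumes even spatial dims), x: [C, H, W]
    C, H, W = x.shape
    return x.reshape(C, H // 2, 2, W // 2, 2).mean(axis=(2, 4))

def upsample2(x):
    # Nearest-neighbour 2x upsample -- a stand-in for the bilinear
    # section sigma; exact reconstruction holds for ANY linear upsample.
    return x.repeat(2, axis=1).repeat(2, axis=2)

def cofiber_decompose(f, n):
    """Split features f: [C, H, W] into n scale bands."""
    bands = []
    residual = f
    for _ in range(n - 1):
        omega = avgpool2(residual)
        bands.append(residual - upsample2(omega))  # cofiber_k
        residual = omega
    bands.append(residual)  # coarsest band cofiber_{n-1}
    return bands

def reconstruct(bands):
    # Exact inverse: fold the coarsest band back up through the pyramid.
    x = bands[-1]
    for band in reversed(bands[:-1]):
        x = band + upsample2(x)
    return x

f = np.random.randn(4, 32, 32)
bands = cofiber_decompose(f, 3)
print([b.shape for b in bands])            # [(4, 32, 32), (4, 16, 16), (4, 8, 8)]
print(np.allclose(reconstruct(bands), f))  # True
```

The decomposition is lossless by construction: at each level, cofiber_k plus the upsampled residual recovers the previous residual exactly.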

The construction is machine-checked in Rocq/HoTT (`CofiberDecomposition.v`). The proof frames average pooling and bilinear upsampling as an adjoint pair whose counit gives a short exact sequence in a semi-additive category; the cofiber bands are the kernels of the projections, and the sum is exact by construction.

Best Results (COCO val2017)
| Variant | Params | mAP | mAP@0.5 | mAP@0.75 | Category |
|---|---:|---:|---:|---:|---|
| split_tower_192h_5std_4dw | 4,016,441 | 20.7 | 28.5 | 22.8 | trained |
| split_tower_224h_3std_6dw | 3,849,657 | 20.3 | 28.1 | 22.3 | trained |
| conv_deep_p3_lateral | 4,269,785 | 19.9 | 28.4 | 22.0 | trained |
| conv_deep_p3 | 3,972,569 | 19.7 | 28.3 | 21.6 | trained |
| conv_deep_3.38M | 3,381,592 | 18.8 | 27.4 | 20.9 | trained |
| conv_deep_912k | 911,960 | 17.2 | 25.6 | 19.2 | trained |
| evolved_deep | 182,580 | 10.6 | 18.9 | 10.8 | trained |
| spatialreg_92k | 91,960 | 8.2 | 25.7 | 2.8 | trained |
| box32_92k | 91,640 | 5.9 | 21.4 | 1.3 | trained |
| box32 pruned R2 | ~62,000 nz | 5.9 | 20.4 | 1.5 | trained |
| dim20 | 22,076 | 3.9 | 14.8 | 0.9 | trained |
| analytical_70k | 69,976 | 1.6 | 6.0 | 0.4 | analytical |
| evolved K=100 person | 105 | 1.3 | 5.8 | 0.1 | circuit |
| Baseline FCOS (non-cofiber) | 16,138,074 | 41.0 | 64.8 | 43.2 | reference |

The best split_tower head reaches 20.7 mAP with 4.02M parameters — 50.5% of the FCOS baseline's 41.0 mAP at 24.9% of its parameters. The architecture has separate classification and regression towers, each consisting of 5 standard 3×3 convolutions followed by 4 depthwise residual blocks at 192 hidden channels, operating on cofiber-decomposed features with a stride-8 P3 level and top-down lateral connections. An earlier variant at 224 hidden channels with 3 standard + 6 depthwise layers reached 20.3 mAP at 3.85M parameters; narrowing the channels while adding more cross-channel-mixing standard convolutions gave the +0.4 mAP improvement.
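A rough parameter budget for the split-tower head under assumed block shapes (each depthwise residual block as depthwise 3×3 + pointwise 1×1, a 1×1 stem projecting the 768-dim backbone to 192 channels, FCOS-style cls/box/centerness prediction convs). These shapes are illustrative assumptions, not the repository's exact layer list, so the total lands near, not exactly on, the reported 4,016,441:

```python
C = 192  # hidden channels ("192h")

def conv2d(cin, cout, k):
    # standard conv: k*k*cin*cout weights + cout biases
    return k * k * cin * cout + cout

def dw_block(c):
    # ASSUMED depthwise residual block: depthwise 3x3 + pointwise 1x1
    return (9 * c + c) + (c * c + c)

tower = 5 * conv2d(C, C, 3) + 4 * dw_block(C)  # "5 std + 4 dw"
stem = conv2d(768, C, 1)                       # assumed 1x1 backbone projection
heads = conv2d(C, 80, 3) + conv2d(C, 4, 3) + conv2d(C, 1, 3)  # assumed cls/box/ctr

total = 2 * tower + stem + heads  # separate classification and regression towers
print(f"{total:,}")  # ~3.9M under these assumptions, close to the reported 4.02M
```

The gap to the reported count presumably comes from pieces this sketch omits (normalization layers, per-level scales, or a second stem).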

Repository Structure
`analytical/`

| Path | Description |
|---|---|
| analytical_70k/ | Closed-form least-squares head. 70K params, 1.6 mAP, zero training |
| analytical_h1/ | Sheaf cohomology (H^1) features. Experimental |
| variants/ | Exotic feature experiments (quadratic, RFF, Fourier, fractal) with result JSONs |
| scripts/ | analytical_greedy_gpu.py, analytical_exotic_gpu.py, analytical_empbayes.py, etc. |
`trained/`

| Path | Params | mAP | Description |
|---|---:|---:|---|
| split_tower/ | 4.02M | 20.7 | Split cls/reg towers with standard + depthwise hybrid. Current best |
| conv_deep/ | 912K-4.27M | 17.2-19.9 | Depthwise residual stack variants (scaled, P3, lateral) |
| evolved_deep/ | 182K | 10.6 | 10-layer MLP on 92 evolutionarily selected dims |
| spatialreg_92k/ | 92K | 8.2 | 3×3 depthwise conv on regression output |
| linear_70k/ | 70K | 5.2 | Trained linear classifier |
| box32_92k/ | 92K | 5.9 | INT8 threshold logic circuit + pruned variants (46K-76K) |
| box32_distilled/ | 92K | — | Self-distillation of box32 |
| dim_sweeps/ | 9K-80K | 0.3-? | SVD-initialized fixed-dim heads (5, 10, 15, 20, 30, 80) |
| sloe/ | — | 0.0 | Spectral Laplacian object emergence (failed experiment) |
| person_specialist/ | 9K | — | Person-only detector |
| waldo_specialist/ | 5K | — | Waldo-finding detector |
| experimental_scaffolds/ | — | — | Untrained architectural scaffolds (5scale, adaptive, centernet, linear) |
`circuit/`

| File | Description |
|---|---|
| person_analytical.pth | Person classifier at 93 parameters, 99.8% recall |
| person_detector.sv, cofiber_detector.sv | Verilog implementations |
| rom/*.hex | INT8 weight ROMs |
| evolved_K100_person_eval.json | Evolutionary search result, 105 params, 1.3 mAP |
| tb_person.sv | Testbench |
`scripts/`

| Script | Target |
|---|---|
| train_split_tower.py | Split tower (best) |
| train_conv_deep.py | Conv deep family (912K-4.27M) |
| train_evolved_deep.py | Evolved deep on 92 dims |
| eval_conv_deep_step.py | Eval any conv_deep checkpoint |
| eval_evolved_deep.py | Eval evolved_deep checkpoint |
| eval_coco_map.py | Generic COCO mAP eval |
`CofiberDecomposition.v`

Rocq/HoTT machine-checked proof that the cofiber decomposition is exact in a semi-additive category: every input decomposes uniquely as a sum of scale bands with zero cross-term residual.
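In symbols (notation assumed here, not taken from the proof file): write π for 2× average pooling and σ for the bilinear upsampling section, so the bands are c_k = (id − σπ)π^k f for k < n−1 and c_{n−1} = π^{n−1} f. Exactness is then the telescoping identity

```latex
f \;=\; \sum_{k=0}^{n-2} \sigma^{k}\,(\mathrm{id} - \sigma\pi)\,\pi^{k} f \;+\; \sigma^{n-1}\pi^{n-1} f ,
```

which holds because each term σ^k π^k f cancels against the σ^{k+1} π^{k+1} f contributed by the next summand, leaving only f; linearity of π and σ is all that is used.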

Scaling Curve

The relationship between head parameters and mAP is approximately logarithmic across four orders of magnitude:

      105 params → 1.3 mAP  (evolved circuit, person only)
      70K params → 1.6 mAP  (analytical closed-form)
      92K params → 8.2 mAP  (depthwise conv on regression)
     182K params → 10.6 mAP (evolved dim selection + 10-layer MLP)
     912K params → 17.2 mAP (depthwise conv stack)
    3.97M params → 19.7 mAP (with stride-8 P3)
    3.85M params → 20.3 mAP (split cls/reg towers, 3 std + 6 dw at 224 hidden)
    4.02M params → 20.7 mAP (split cls/reg towers, 5 std + 4 dw at 192 hidden)
   16.14M params → 41.0 mAP (FCOS baseline with FPN)
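A quick sanity check of the stated trend, correlating mAP against log10(params) for the nine points above. The strength of the fit is an observation about this table, not a claim from the source:

```python
import numpy as np

# (params, mAP) pairs from the scaling curve above
points = [
    (105, 1.3), (70_000, 1.6), (92_000, 8.2), (182_000, 10.6),
    (912_000, 17.2), (3_970_000, 19.7), (3_850_000, 20.3),
    (4_020_000, 20.7), (16_140_000, 41.0),
]
params, mAP = map(np.array, zip(*points))

# Pearson correlation between log-params and mAP
r = np.corrcoef(np.log10(params), mAP)[0, 1]

# least-squares slope: mAP gained per decade of parameters
slope, intercept = np.polyfit(np.log10(params), mAP, 1)
print(f"r = {r:.2f}, ~{slope:.1f} mAP per 10x params")
```

The correlation is strong but not perfect: the 105-param circuit and the FCOS baseline both sit off the line traced by the mid-sized heads.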
Broader Detection Work

Non-cofiber detection heads (FCOS baseline, untrained architectural variants, alternative formulations) are hosted in phanerozoic/detection-heads, which also includes the top-performing cofiber head (split_tower) for reference. This repository is the canonical host for cofiber-based detection research.

License

Fair Research License. See LICENSE.
