{"id":13528732,"url":"https://github.com/cauchyturing/UCR_Time_Series_Classification_Deep_Learning_Baseline","last_synced_at":"2025-04-01T14:32:57.602Z","repository":{"id":42461586,"uuid":"73790321","full_name":"cauchyturing/UCR_Time_Series_Classification_Deep_Learning_Baseline","owner":"cauchyturing","description":"Fully Convlutional Neural Networks for state-of-the-art time series classification","archived":false,"fork":false,"pushed_at":"2019-07-03T00:12:57.000Z","size":1850,"stargazers_count":677,"open_issues_count":12,"forks_count":203,"subscribers_count":27,"default_branch":"master","last_synced_at":"2024-11-02T15:36:14.043Z","etag":null,"topics":["convolutional-networks","deep-learning","time-series"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/cauchyturing.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2016-11-15T07:58:55.000Z","updated_at":"2024-10-31T10:17:27.000Z","dependencies_parsed_at":"2022-09-13T05:02:36.137Z","dependency_job_id":null,"html_url":"https://github.com/cauchyturing/UCR_Time_Series_Classification_Deep_Learning_Baseline","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cauchyturing%2FUCR_Time_Series_Classification_Deep_Learning_Baseline","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cauchyturing%2FUCR_Time_Series_Classification_Deep_Learning_Baseline/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cauchyturing%2FUCR_Time_Series_Classification_Deep_Learning_Ba
seline/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cauchyturing%2FUCR_Time_Series_Classification_Deep_Learning_Baseline/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/cauchyturing","download_url":"https://codeload.github.com/cauchyturing/UCR_Time_Series_Classification_Deep_Learning_Baseline/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246655241,"owners_count":20812605,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["convolutional-networks","deep-learning","time-series"],"created_at":"2024-08-01T07:00:23.494Z","updated_at":"2025-04-01T14:32:55.462Z","avatar_url":"https://github.com/cauchyturing.png","language":"Python","readme":"# Deep Learning for Time Series Classification \nAs the simplest type of time series data, univariate time series provide a reasonably good starting point for studying temporal signals. Research on representation learning and classification has found many potential applications in fields like finance, industry, and health care. Common similarity measures like Dynamic Time Warping (DTW) or the Euclidean Distance (ED) are decades old. 
Recent efforts on feature engineering and distance measure design achieve much higher accuracy on the UCR time series classification benchmarks (e.g., BOSS [[1]](http://link.springer.com/article/10.1007%2Fs10618-015-0441-y),[[2]](http://link.springer.com/article/10.1007%2Fs10618-014-0377-7), PROP [[3]](http://link.springer.com/article/10.1007/s10618-014-0361-2) and COTE [[4]](http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7069254)), but at the cost of higher complexity and lower interpretability. \n\nDeep neural networks, especially convolutional neural networks (CNNs), are also under active exploration for end-to-end time series classification, e.g., the multi-channel CNN (MC-CNN) [[5]](http://link.springer.com/article/10.1007/s11704-015-4478-2) and the multi-scale CNN (MCNN) [[6]](https://arxiv.org/abs/1603.06995). However, these models still need heavy preprocessing and a large set of hyperparameters, which makes them complicated to deploy. \n\nThis repository contains three deep neural network models (MLP, FCN and ResNet) for pure end-to-end, interpretable time series analytics. These models provide a good baseline both for application to real-world data and for future research in deep learning on time series.\n\n## Before You Start\nWhat is the best approach to classify time series? Very hard to say. From the experiments we did, COTE and BOSS are among the best, and the DL-based approaches (FCN, ResNet or MCNN) show no significant difference from them. If you prefer a white-box model, try BOSS first. If you like an end-to-end solution, use FCN, or even MLP with dropout, as your first baseline (FCN also supports a certain level of model interpretability via CAM or Grad-CAM). \n\nHowever, the UCR time series datasets are rather 'ideal' data. In more realistic scenarios, highly skewed labels, very non-stationary dynamics, and frequent distribution/concept drift occur everywhere. 
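The CAM-based interpretability mentioned above boils down to a weighted sum of the last conv block's feature maps, using the softmax weights that sit after global average pooling. A minimal NumPy sketch (the array shapes and names here are hypothetical illustrations, not the repository's actual layer sizes):

```python
import numpy as np

def class_activation_map(A, W, c):
    """CAM for class c: CAM_c(t) = sum_k W[k, c] * A[t, k].

    A: (T, K) feature maps from the last conv block (T time steps, K filters).
    W: (K, C) softmax weights following global average pooling.
    Returns a length-T vector highlighting discriminative time steps.
    """
    return A @ W[:, c]

# Toy example with random features and weights (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 128))  # T=100 time steps, K=128 feature maps
W = rng.standard_normal((128, 3))    # C=3 classes
cam = class_activation_map(A, W, 0)
print(cam.shape)  # (100,)
```

In practice `A` and `W` would be extracted from the trained FCN or ResNet; the computation itself is just this single matrix-vector product per class.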
Hopefully we can address these more complex issues with a neat and effective DL-based framework that enables end-to-end solutions with good model interpretability, and yes, that is exactly what we are working on.\n\n## Network Structure\n![Network Structure](Archi.jpg)\nThree deep neural network architectures are exploited to provide a comprehensive baseline.\n\n## Localize the Contributing Region with Class Activation Map\nAnother benefit of FCN and ResNet with a global average pooling layer is their natural extension, the class activation map (CAM), which interprets the class-specific regions in the data [[7]](https://arxiv.org/abs/1512.04150).\n![CAM](CAM.jpg)\n\nWe can see that the discriminative regions of the time series for the right classes are highlighted. We also highlight the differences in the CAMs for the different labels: the contributing regions for different categories differ. The CAM provides a natural way to find the contributing region in the raw data for a specific label. This enables classification-trained convolutional networks to learn to localize without any extra effort. Class activation maps also allow us to visualize the predicted class scores on any given time series, highlighting the discriminative subsequences detected by the convolutional networks, and they offer a possible explanation of how the convolutional networks work in the classification setting.\n\n## Visualize the Filters/Weights\nWe adopt the Gramian Angular Summation Field (GASF) [[8]](https://arxiv.org/abs/1506.00327) to visualize the filters/weights in the neural networks. The weights from the second and the last layer in the MLP are very similar, with clear structures and very little degradation. 
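For reference, the GASF used for this visualization can be sketched in NumPy as follows. The min-max rescaling to [-1, 1] is an assumption following the common convention in the GASF paper, not necessarily this repository's exact preprocessing:

```python
import numpy as np

def gasf(x):
    """Gramian Angular Summation Field of a 1-D sequence.

    x is rescaled to [-1, 1], mapped to polar angles phi = arccos(x_tilde),
    and GASF[i, j] = cos(phi_i + phi_j)
                   = x_i * x_j - sqrt(1 - x_i^2) * sqrt(1 - x_j^2).
    """
    x = np.asarray(x, dtype=float)
    # Min-max rescale to [-1, 1] so arccos is well defined.
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    s = np.sqrt(np.clip(1 - x**2, 0.0, 1.0))
    return np.outer(x, x) - np.outer(s, s)

# Example: visualize a length-64 sequence (e.g., a filter or weight vector).
G = gasf(np.sin(np.linspace(0, 2 * np.pi, 64)))
print(G.shape)  # (64, 64)
```

The resulting symmetric matrix can then be rendered as a heatmap, which is how the weight structures above are compared.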
The weights in the first layer generally have higher values than those in the following layers.\n![Feature](Feature.jpg)\n\n## Classification Results\nThis table reports the test (not training) classification error rates on 85 UCR time series data sets. For more details on the experimental settings, please refer to our paper. \n\nPlease note that the 'best' row is not a reasonable performance measure. The MPCE score is TODO.\n\n| Dataset                        | **MLP** | **FCN** | **ResNet** | **PROP** | **COTE** | **1NN-DTW** | **1NN-BOSS** | **BOSS-VS** |\n|--------------------------------|-------|--------|-------|-------|---------|----------|---------|-------|\n| 50words                        | 0.288 | 0.321  | 0.273 | 0.180 | 0.191   | 0.310    | 0.301   | 0.367 | \n| Adiac                          | 0.248 | 0.143  | 0.174 | 0.353 | 0.233   | 0.396    | 0.220   | 0.302 | \n| ArrowHead                      | 0.177 | 0.120  | 0.183 | 0.103 | /       | 0.337    | 0.143   | 0.171 | \n| Beef                           | 0.167 | 0.25   | 0.233 | 0.367 | 0.133   | 0.367    | 0.200   | 0.267 | \n| BeetleFly                      | 0.150 | 0.050  | 0.200 | 0.400 | /       | 0.300    | 0.100   | 0.000 | \n| BirdChicken                    | 0.200 | 0.050  | 0.100 | 0.350 | /       | 0.250    | 0.000   | 0.100 | \n| Car                            | 0.167 | 0.083  | 0.067 | /     | /       | /        | /       | /     | \n| CBF                            | 0.14  | 0      | 0.006 | 0.002 | 0.001   | 0.003    | 0       | 0.001 | \n| ChlorineCon                    | 0.128 | 0.157  | 0.172 | 0.360 | 0.314   | 0.352    | 0.340   | 0.345 | \n| CinCECGTorso                   | 0.158 | 0.187  | 0.229 | 0.062 | 0.064   | 0.349    | 0.125   | 0.130 | \n| Coffee                         | 0     | 0      | 0     | 0     | 0       | 0        | 0       | 0.036 | \n| Computers                 
     | 0.460 | 0.152  | 0.176 | 0.116 |         | 0.300    | 0.296   | 0.324 | \n| CricketX                       | 0.431 | 0.185  | 0.179 | 0.203 | 0.154   | 0.246    | 0.259   | 0.346 | \n| CricketY                       | 0.405 | 0.208  | 0.195 | 0.156 | 0.167   | 0.256    | 0.208   | 0.328 | \n| CricketZ                       | 0.408 | 0.187  | 0.187 | 0.156 | 0.128   | 0.246    | 0.246   | 0.313 | \n| DiatomSizeR                    | 0.036 | 0.07   | 0.069 | 0.059 | 0.082   | 0.033    | 0.046   | 0.036 | \n| DistalPhalanxOutlineAgeGroup   | 0.173 | 0.165  | 0.202 | 0.223 | /       | 0.208    | 0.180   | 0.155 | \n| DistalPhalanxOutlineCorrect    | 0.190 | 0.188  | 0.180 | 0.232 | /       | 0.232    | 0.208   | 0.282 | \n| DistalPhalanxTW                | 0.253 | 0.210  | 0.260 | 0.317 | /       | 0.290    | 0.223   | 0.253 | \n| Earthquakes                    | 0.208 | 0.199  | 0.214 | 0.281 | /       | 0.258    | 0.186   | 0.193 | \n| ECG200                         | 0.080 | 0.100  | 0.130 | /     | /       | 0.230    | 0.130   | 0.180 | \n| ECG5000                        | 0.065 | 0.059  | 0.069 | 0.350 | /       | 0.250    | 0.056   | 0.110 | \n| ECGFiveDays                    | 0.03  | 0.015  | 0.045 | 0.178 | 0       | 0.232    | 0.000   | 0.000 | \n| ElectricDevices                | 0.420 | 0.277  | 0.272 | 0.277 | /       | 0.399    | 0.282   | 0.351 | \n| FaceAll                        | 0.115 | 0.071  | 0.166 | 0.152 | 0.105   | 0.192    | 0.210   | 0.241 | \n| FaceFour                       | 0.17  | 0.068  | 0.068 | 0.091 | 0.091   | 0.170    | 0       | 0.034 | \n| FacesUCR                       | 0.185 | 0.052  | 0.042 | 0.063 | 0.057   | 0.095    | 0.042   | 0.103 | \n| fish                           | 0.126 | 0.029  | 0.011 | 0.034 | 0.029   | 0.177    | 0.011   | 0.017 | \n| FordA                          | 0.231 | 0.094  | 0.072 | 0.182 | /       | 0.438    | 0.083   | 0.096 | \n| FordB                          | 0.371 | 0.117  | 0.100 | 0.265 
| /       | 0.406    | 0.109   | 0.111 | \n| GunPoint                       | 0.067 | 0      | 0.007 | 0.007 | 0.007   | 0.093    | 0       | 0     | \n| Ham                            | 0.286 | 0.238  | 0.219 | /     | /       | 0.533    | 0.343   | 0.286 | \n| HandOutlines                   | 0.193 | 0.224  | 0.139 | /     | /       | 0.202    | 0.130   | 0.152 | \n| Haptics                        | 0.539 | 0.449  | 0.494 | 0.584 | 0.481   | 0.623    | 0.536   | 0.584 | \n| Herring                        | 0.313 | 0.297  | 0.406 | 0.079 | /       | 0.469    | 0.375   | 0.406 | \n| InlineSkate                    | 0.649 | 0.589  | 0.635 | 0.567 | 0.551   | 0.616    | 0.511   | 0.573 | \n| InsectWingbeatSound            | 0.369 | 0.598  | 0.469 | /     | /       | 0.645    | 0.479   | 0.430 | \n| ItalyPower                     | 0.034 | 0.03   | 0.040 | 0.039 | 0.036   | 0.050    | 0.053   | 0.086 | \n| LargeKitchenAppliances         | 0.520 | 0.104  | 0.107 | 0.232 | /       | 0.205    | 0.280   | 0.304 | \n| Lightning2                     | 0.279 | 0.197  | 0.246 | 0.115 | 0.164   | 0.131    | 0.148   | 0.262 | \n| Lightning7                     | 0.356 | 0.137  | 0.164 | 0.233 | 0.247   | 0.274    | 0.342   | 0.288 | \n| MALLAT                         | 0.064 | 0.02   | 0.021 | 0.050 | 0.036   | 0.066    | 0.058   | 0.064 | \n| Meat                           | 0.067 | 0.033  | 0.000 | /     | /       | 0.067    | 0.100   | 0.167 | \n| MedicalImages                  | 0.271 | 0.208  | 0.228 | 0.245 | 0.258   | 0.263    | 0.288   | 0.474 | \n| MiddlePhalanxOutlineAgeGroup   | 0.265 | 0.232  | 0.240 | 0.474 | /       | 0.250    | 0.218   | 0.253 | \n| MiddlePhalanxOutlineCorrect    | 0.240 | 0.205  | 0.207 | 0.210 | /       | 0.352    | 0.255   | 0.350 | \n| MiddlePhalanxTW                | 0.391 | 0.388  | 0.393 | 0.630 | /       | 0.416    | 0.373   | 0.414 | \n| MoteStrain                     | 0.131 | 0.05   | 0.105 | 0.114 | 0.085   | 0.165    | 0.073   | 
0.115 | \n| NonInvThorax1                  | 0.058 | 0.039  | 0.052 | 0.178 | 0.093   | 0.210    | 0.161   | 0.169 | \n| NonInvThorax2                  | 0.057 | 0.045  | 0.049 | 0.112 | 0.073   | 0.135    | 0.101   | 0.118 | \n| OliveOil                       | 0.60  | 0.167  | 0.133 | 0.133 | 0.100   | 0.167    | 0.100   | 0.133 | \n| OSULeaf                        | 0.43  | 0.012  | 0.021 | 0.194 | 0.145   | 0.409    | 0.012   | 0.074 | \n| PhalangesOutlinesCorrect       | 0.170 | 0.174  | 0.175 | /     | /       | 0.272    | 0.217   | 0.317 | \n| Phoneme                        | 0.902 | 0.655  | 0.676 | /     | /       | 0.772    | 0.733   | 0.825 | \n| Plane                          | 0.019 | 0      | 0     |       | /       | /        | /       | /     | \n| ProximalPhalanxOutlineAgeGroup | 0.176 | 0.151  | 0.151 | 0.117 | /       | 0.195    | 0.137   | 0.244 | \n| ProximalPhalanxOutlineCorrect  | 0.113 | 0.100  | 0.082 | 0.172 | /       | 0.216    | 0.131   | 0.134 | \n| ProximalPhalanxTW              | 0.203 | 0.190  | 0.193 | 0.244 | /       | 0.263    | 0.203   | 0.248 | \n| RefrigerationDevices           | 0.629 | 0.467  | 0.472 | 0.424 | /       | 0.536    | 0.512   | 0.488 | \n| ScreenType                     | 0.592 | 0.333  | 0.293 | 0.440 | /       | 0.603    | 0.544   | 0.464 | \n| ShapeletSim                    | 0.517 | 0.133  | 0.000 | /     | /       | 0.350    | 0.044   | 0.022 | \n| ShapesAll                      | 0.225 | 0.102  | 0.088 | 0.187 | /       | 0.232    | 0.082   | 0.188 | \n| SmallKitchenAppliances         | 0.611 | 0.197  | 0.203 | 0.187 | /       | 0.357    | 0.200   | 0.221 | \n| SonyAIBORobot                  | 0.273 | 0.032  | 0.015 | 0.293 | 0.146   | 0.275    | 0.321   | 0.265 | \n| SonyAIBORobotII                | 0.161 | 0.038  | 0.038 | 0.124 | 0.076   | 0.169    | 0.098   | 0.188 | \n| StarLightCurves                | 0.043 | 0.033  | 0.025 | 0.079 | 0.031   | 0.093    | 0.021   | 0.096 | \n| Strawberry                
     | 0.033 | 0.031  | 0.042 | /     | /       | 0.060    | 0.042   | 0.024 | \n| SwedishLeaf                    | 0.107 | 0.034  | 0.042 | 0.085 | 0.046   | 0.208    | 0.072   | 0.141 | \n| Symbols                        | 0.147 | 0.038  | 0.128 | 0.049 | 0.046   | 0.050    | 0.032   | 0.029 | \n| SyntheticControl               | 0.05  | 0.01   | 0.000 | 0.010 | 0.000   | 0.007    | 0.030   | 0.040 | \n| ToeSegmentation1               | 0.399 | 0.031  | 0.035 | 0.079 | /       | 0.228    | 0.048   | 0.031 | \n| ToeSegmentation2               | 0.254 | 0.085  | 0.138 | 0.085 | /       | 0.162    | 0.038   | 0.069 | \n| Trace                          | 0.18  | 0      | 0     | 0.010 | 0.010   | 0        | 0       | 0     | \n| TwoLeadECG                     | 0.147 | 0      | 0     | 0.067 | 0.015   | 0.096    | 0.016   | 0.001 | \n| TwoPatterns                    | 0.114 | 0.103  | 0     | 0     | 0       | 0        | 0.004   | 0.015 | \n| UWaveGestureLibraryAll         | 0.046 | 0.174  | 0.132 | 0.199 | 0.196   | 0.272    | 0.241   | 0.270 | \n| UWaveX                         | 0.232 | 0.246  | 0.213 | 0.283 | 0.267   | 0.366    | 0.313   | 0.364 | \n| UWaveY                         | 0.297 | 0.275  | 0.332 | 0.290 | 0.265   | 0.342    | 0.312   | 0.336 | \n| UWaveZ                         | 0.295 | 0.271  | 0.245 | 0.029 | /       | 0.108    | 0.059   | 0.098 | \n| wafer                          | 0.004 | 0.003  | 0.003 | 0.003 | 0.001   | 0.020    | 0.001   | 0.001 | \n| Wine                           | 0.204 | 0.111  | 0.204 | /     | /       | 0.426    | 0.167   | 0.296 | \n| WordSynonyms                   | 0.406 | 0.42   | 0.368 | 0.226 | /       | 0.252    | 0.345   | 0.491 | \n| Worms                          | 0.657 | 0.331  | 0.381 | /     | /       | 0.536    | 0.392   | 0.398 | \n| WormsTwoClass                  | 0.403 | 0.271  | 0.265 | /     | /       | 0.337    | 0.243   | 0.315 | \n| yoga                           | 0.145 | 0.155  | 0.142 | 0.121 
| 0.113   | 0.164    | 0.081   | 0.169 | \n| Best | 6 | 27 | 21 | 14 | 10 | 4 | 21 | 9 | \n\n## Dependencies\nKeras (TensorFlow backend), NumPy.\n\n## Cite\nIf you find either the code or the results helpful to your work, please kindly cite our papers:\n\n[**Time Series Classification from Scratch with Deep Neural Networks: A Strong Baseline**](https://arxiv.org/abs/1611.06455)\n\n[**Imaging Time-Series to Improve Classification and Imputation**](https://arxiv.org/abs/1506.00327)\n\n## License\nThis project is licensed under the MIT License.\n\nMIT License\n\nCopyright (c) [2019] [Zhiguang Wang]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n","funding_links":[],"categories":["Examples or singular models","💻 Repos with Models"],"sub_categories":["Managed database services"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcauchyturing%2FUCR_Time_Series_Classification_Deep_Learning_Baseline","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcauchyturing%2FUCR_Time_Series_Classification_Deep_Learning_Baseline","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcauchyturing%2FUCR_Time_Series_Classification_Deep_Learning_Baseline/lists"}