Search results: 3 found (listing 1-3 of 3)
Image analysis is a fundamental task for extracting information from images acquired across a range of different devices. Since reliable quantitative results are requested, image analysis requires highly sophisticated numerical and analytical methods—particularly for applications in medicine, security, and remote sensing, where the results of the processing may consist of vitally important data. The contributions to this book provide a good overview of the most important demands and solutions concerning this research area. In particular, the reader will find image analysis applied for feature extraction, encryption and decryption of data, color segmentation, and in the support new technologies. In all the contributions, entropy plays a pivotal role.
Keywords: image retrieval; multi-feature fusion; entropy; relevance feedback; chaotic system; image encryption; permutation-diffusion; SHA-256 hash value; dynamic index; entropy; keyframes; Shannon's entropy; sign languages; video summarization; video skimming; image encryption; multiple-image encryption; two-dimensional chaotic economic map; security analysis; image encryption; chaotic cryptography; cryptanalysis; chosen-plaintext attack; image information entropy; blind image quality assessment (BIQA); information entropy; natural scene statistics (NSS); Weibull statistics; discrete cosine transform (DCT); ultrasound; hepatic steatosis; Shannon entropy; fatty liver; metabolic syndrome; multi-exposure image fusion; texture information entropy; adaptive selection; patch structure decomposition; image encryption; time-delay; random insertion; information entropy; chaotic map; uncertainty assessment; deep neural network; random forest; Shannon entropy; positron emission tomography; reconstruction; field of experts; additive manufacturing; 3D prints; 3D scanning; image entropy; depth maps; surface quality assessment; machine vision; image analysis; Arimoto entropy; free-form deformations; normalized divergence measure; gradient distributions; nonextensive entropy; non-rigid registration; pavement; macrotexture; 3D digital imaging; entropy; decay trend; discrete entropy; infrared images; low contrast; multiscale top-hat transform; image encryption; DNA encoding; chaotic cryptography; cryptanalysis; image privacy; computer-aided diagnostics; colonoscopy; Rényi entropies; structural entropy; spatial filling factor; binary image; Cantor set; Hénon map; Minkowski island; prime-indexed primes; Ramanujan primes; Kapur's entropy; color image segmentation; whale optimization algorithm; differential evolution; hybrid algorithm; Otsu method; image encryption; dynamic filtering; DNA computing; 3D Latin cube; permutation; diffusion; fuzzy entropy; electromagnetic field optimization; chaotic strategy; color image segmentation; multilevel thresholding; contrast enhancement; sigmoid; Tsallis statistics; q-exponential; q-sigmoid; q-Gaussian; ultrasound images; person re-identification; image analysis; hash layer; quantization loss; Hamming distance; cross-entropy loss; image entropy; Shannon entropy; generalized entropies; image processing; image segmentation; medical imaging; remote sensing; security
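Shannon entropy, which the abstract singles out as pivotal in all the contributions, is the common thread through these keywords. As a minimal illustrative sketch (not code from the book), the entropy of a grayscale image can be estimated from its intensity histogram:

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Shannon entropy (in bits) of a list of pixel intensities,
    estimated from the empirical histogram."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform 4-level image attains the maximum entropy log2(4) = 2 bits.
uniform = [0, 1, 2, 3] * 4
print(shannon_entropy(uniform))  # 2.0

# A constant image carries no information: its entropy is zero.
constant = [7] * 16
print(shannon_entropy(constant))
```

High-entropy histograms indicate rich texture (useful in quality assessment and segmentation), while encryption schemes aim to push the ciphertext image's entropy toward its maximum so that no intensity level is favored.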
As the ultimate information-processing device, the brain naturally lends itself to being studied with information theory. The application of information theory to neuroscience has spurred the development of principled theories of brain function and has led to advances in the study of consciousness, as well as to analytical techniques for cracking the neural code, that is, for unveiling the language used by neurons to encode and process information. In particular, experimental techniques that allow precise, large-scale recording and manipulation of neural activity now make it possible, for the first time, to formulate and quantitatively test hypotheses about how the brain encodes and transmits, across areas, the information used for specific functions. This Special Issue presents twelve original contributions on novel approaches in neuroscience using information theory, and on new information-theoretic results inspired by problems in neuroscience.
Keywords: neural network; Potts model; latching; recursion; functional connectome; graph theoretical analysis; eigenvector centrality; orderness; network eigenentropy; information entropy production; discrete Markov chains; spike train statistics; Gibbs measures; maximum entropy principle; pulse-gating; channel capacity; neural coding; feedforward networks; neural information propagation; information theory; mutual information decomposition; synergy; redundancy; integrated information theory; integrated information; minimum information partition; submodularity; Queyranne's algorithm; consciousness; maximum entropy; higher-order correlations; neural population coding; Ising model; brain network; complex networks; connectome; information theory; graph theory; free-energy principle; internal model hypothesis; unconscious inference; infomax principle; independent component analysis; principal component analysis; goodness; categorical perception; perceptual magnet; information theory; perceived similarity; mutual information; synergy; redundancy; neural code; hippocampus; entorhinal cortex; navigation; neural code; representation; decoding; spike-time precision; discrimination; noise correlations; information theory; mismatched decoding; information theory; neuroscience
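Several of these keywords (mutual information, synergy, redundancy, neural coding) rest on the mutual information between a stimulus and a neural response. A minimal sketch, assuming a small discrete joint probability table rather than any dataset from the issue:

```python
import math

def mutual_information(joint):
    """I(S; R) in bits from a joint probability table joint[s][r],
    where rows index stimuli and columns index responses."""
    ps = [sum(row) for row in joint]          # marginal p(s)
    pr = [sum(col) for col in zip(*joint)]    # marginal p(r)
    mi = 0.0
    for s, row in enumerate(joint):
        for r, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (ps[s] * pr[r]))
    return mi

# A perfectly informative code: each stimulus evokes a unique response.
perfect = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(perfect))  # 1.0

# An independent response tells us nothing about the stimulus.
independent = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(independent))  # 0.0
```

Decomposing this quantity across neurons or response features into synergistic and redundant parts is the subject of the mutual information decomposition literature referenced above.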
Entropy theory has wide applications to problems in environmental and water engineering, including river hydraulic geometry, fluvial hydraulics, water-monitoring network design, river flow forecasting, floods and droughts, river network analysis, infiltration, soil moisture, sediment transport, surface-water and groundwater quality modeling, ecosystem modeling, water distribution networks, environmental and water resources management, and parameter estimation. Such applications have used several different entropy formulations, such as the Shannon, Tsallis, Rényi, Burg, Kolmogorov, Kapur, configurational, and relative entropies, which can be derived in the time, space, or frequency domains. More recently, entropy-based concepts have been coupled with other theories, including copulas and wavelets, to study various issues associated with environmental and water resources systems. Recent studies indicate the enormous scope and potential of entropy theory for advancing research in these fields, including establishing and explaining physical connections between theory and reality. The objective of this Special Issue is to provide a platform for compiling important recent and current research on applications of entropy theory in environmental and water engineering. The contributions address many aspects of these applications and demonstrate the scope and potential of entropy theory for advancing research in the field.
Keywords: complexity; streamflow; water level; composite multiscale sample entropy; trend; Poyang Lake basin; four-parameter exponential gamma distribution; principle of maximum entropy; precipitation frequency analysis; methods of moments; maximum likelihood estimation; flood frequency analysis; generalized gamma (GG) distribution; principle of maximum entropy (POME); entropy theory; principle of maximum entropy (POME); GB2 distribution; flood frequency analysis; non-point source pollution; ANN; entropy weighting method; data-scarce; multi-events; spatiotemporal variability; soil water content; entropy; arid region; joint entropy; NDVI; temperature; precipitation; groundwater depth; Hei River basin; turbulent flow; canopy flow; randomness; coherent structures; Shannon entropy; Kolmogorov complexity; entropy; information transfer; optimization; radar; rainfall network; water resource carrying capacity; forewarning model; entropy of information; fuzzy analytic hierarchy process; projection pursuit; accelerating genetic algorithm; entropy production; conditional entropy production; stochastic processes; scaling; climacogram; turbulence; water resources vulnerability; connection entropy; changing environment; set pair analysis; Anhui Province; cross-entropy minimization; land suitability evaluation; spatial optimization; monthly streamflow forecasting; Burg entropy; configurational entropy; entropy spectral analysis; time series analysis; entropy; water monitoring; network design; hydrometric network; information theory; entropy applications; hydrological risk analysis; maximum entropy-copula method; uncertainty; Loess Plateau; entropy; water engineering; Tsallis entropy; principle of maximum entropy; Lagrangian function; probability distribution function; flux concentration relation; uncertainty; information; informational entropy; variation of information; continuous probability distribution functions; confidence intervals; precipitation; variability; marginal entropy; crop yield; Hexi corridor; flow duration curve; Shannon entropy; entropy parameter; modeling; spatial and dynamics characteristic; hydrology; tropical rainfall; statistical scaling; Tsallis entropy; multiplicative cascades; Beta-Lognormal model; rainfall forecast; cross entropy; ant colony fuzzy clustering; combined forecast; information entropy; mutual information; kernel density estimation; ENSO; nonlinear relation; scaling laws; power laws; water distribution networks; robustness; flow entropy; entropy theory; frequency analysis; hydrometeorological extremes; Bayesian technique; rainfall; entropy ensemble filter; ensemble model simulation criterion; EEF method; bootstrap aggregating; bagging; bootstrap neural networks; El Niño; ENSO; neural network forecast; sea surface temperature; tropical Pacific; entropy; cross elasticity; mean annual runoff; water resources; resilience; quaternary catchment; complement; substitute; entropy theory; complex systems; hydraulics; hydrology; water engineering; environmental engineering
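The principle of maximum entropy (POME), which recurs throughout these keywords, selects, among all distributions satisfying given constraints, the one with maximal entropy. A minimal illustrative sketch for a discrete support and a mean constraint; the `maxent_distribution` helper is hypothetical and not taken from any contribution:

```python
import math

def maxent_distribution(values, target_mean, tol=1e-10):
    """Discrete maximum-entropy distribution over `values` subject to a
    fixed mean: p_i proportional to exp(-lam * x_i), with the Lagrange
    multiplier lam found by bisection on the implied mean."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0  # mean_for is decreasing in lam on this bracket
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid  # mean too high -> need larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# With the mean pinned at the midpoint, POME recovers the uniform law.
p = maxent_distribution([0, 1, 2, 3], target_mean=1.5)
print([round(pi, 6) for pi in p])  # [0.25, 0.25, 0.25, 0.25]
```

Tighter mean constraints tilt the solution toward an exponential (Gibbs) shape, which is the discrete analogue of how POME yields the exponential and gamma-family distributions used in the flood and precipitation frequency analyses listed above.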
Publication year: 2019 (3)