Search results


Number of results: 3

Abstract

Audio data compression is used to reduce the transmission bandwidth and storage requirements of audio data. It is the second stage in the audio mastering process, audio equalization being the first. Compression algorithms such as BSAC, MP3 and AAC are used as reference standards in this paper. The main challenge in audio compression is compressing the signal at low bit rates: algorithms that perform well at low bit rates are not dominant at higher bit rates, and vice versa. This paper proposes a modified vector quantization algorithm that produces a scalable bit stream with a number of fine layers of audio fidelity. The modified algorithm is used to build a scalable perceptual audio coder in which scalability is confined to the quantization and encoding stages, since these are the stages responsible for the psychoacoustic and arithmetic reductions; practically all the data removed during the prediction stages at the encoder is added back to the audio signal at the decoder. The quantization stage is therefore the one modified to produce a scalable bit stream. The modified algorithm works well at both lower and higher bit rates. Subjective evaluations were carried out by audio professionals using the MUSHRA test, and the mean normalized scores at various bit rates were recorded and compared with those of the previous algorithms.
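A minimal illustrative sketch of the layered idea described above, assuming a multi-stage vector quantizer in which each refinement layer quantizes the residual left by the previous layer, so the bit stream can be truncated after any layer. The codebooks, frame size, and random data are placeholders, not the paper's actual coder.

```python
import numpy as np

def encode_layers(frame, codebooks):
    """Return one codebook index per layer; the residual shrinks layer by layer."""
    residual = frame.astype(float)
    indices = []
    for codebook in codebooks:                       # one codebook per fidelity layer
        dists = np.linalg.norm(codebook - residual, axis=1)
        idx = int(np.argmin(dists))                  # nearest codevector
        indices.append(idx)
        residual = residual - codebook[idx]          # pass the residual to the next layer
    return indices

def decode_layers(indices, codebooks):
    """Reconstruct from however many layers were received (bit-stream scalability)."""
    frame = np.zeros(codebooks[0].shape[1])
    for idx, codebook in zip(indices, codebooks):
        frame += codebook[idx]                       # each received layer adds fidelity
    return frame

# Toy usage: 3 layers, 8 codevectors each, 4-sample frames.
rng = np.random.default_rng(0)
codebooks = [rng.standard_normal((8, 4)) for _ in range(3)]
frame = rng.standard_normal(4)
idxs = encode_layers(frame, codebooks)
coarse = decode_layers(idxs[:1], codebooks)          # base layer only (low bit rate)
fine = decode_layers(idxs, codebooks)                # all layers (higher bit rate)
```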

Authors and Affiliations

Shajin Prince ¹, Bini D ¹, A Alfred Kirubaraj ¹, J Samson Immanuel ¹, Surya M ¹

  1. Karunya Institute of Technology and Sciences, Coimbatore, India

Abstract

The impact of complexity within government and societal systems is considered relative to the limitations of human cognitive bandwidth, and the resulting reliance on cognitive biases and systems of automation when that bandwidth is exceeded. Examples are given of how humans and societies have attempted to cope with the growing gap between the rate at which system complexity increases and the rate at which human cognitive capacities do. The potential of, and urgent need for, systems capable of handling the existing and future complexity of such systems by utilizing greater cognitive bandwidth through scalable AGI is also considered, along with the practical limitations and considerations involved in deploying those systems under real-world conditions. Several paradoxes resulting from the influence of prolific Narrow Tool AI systems manipulating large portions of the population are also noted.

Authors and Affiliations

Kyrtin Atreides ¹

  1. AGI Laboratory, Seattle, WA, USA

Abstract

This paper models a downlink Fifth Generation (5G) network that supports a flexible frame structure and a shorter Round-Trip Time (RTT) for Hybrid Automatic Repeat Request (HARQ). Moreover, the design of the well-known Time Division Multiple Access (TDMA) packet scheduling algorithms is revised so that these algorithms can support packet scheduling in the downlink 5G network. Simulation results demonstrate that Proportional Fair provides performance comparable to the delay-aware Maximum-Largest Weighted Delay First algorithm in simultaneously providing the desired transmission reliability for Guaranteed Bit Rate (GBR) and Non-Guaranteed Bit Rate (Non-GBR) healthcare contents whilst maximizing downlink 5G performance.
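A minimal sketch of the per-user scheduling metrics compared above, assuming the textbook forms of Proportional Fair (PF) and Maximum-Largest Weighted Delay First (M-LWDF); the user parameters below are illustrative placeholders rather than values from the paper's simulations.

```python
import math

def pf_metric(inst_rate, avg_throughput):
    # PF: favour users whose current channel quality is good relative to their
    # past average throughput.
    return inst_rate / max(avg_throughput, 1e-9)

def mlwdf_metric(inst_rate, avg_throughput, hol_delay, delay_budget, drop_prob):
    # M-LWDF: weight the PF metric by the head-of-line (HoL) packet delay and a
    # QoS factor a = -log(drop_prob) / delay_budget, so delay-critical GBR
    # traffic is prioritized as it approaches its delay budget.
    a = -math.log(drop_prob) / delay_budget
    return a * hol_delay * pf_metric(inst_rate, avg_throughput)

# Toy comparison: two users competing for one downlink resource.
users = {
    "gbr_telemetry": dict(inst_rate=2e6, avg_throughput=1e6,
                          hol_delay=0.08, delay_budget=0.10, drop_prob=1e-3),
    "non_gbr_web":   dict(inst_rate=4e6, avg_throughput=1e6,
                          hol_delay=0.01, delay_budget=0.30, drop_prob=1e-1),
}
for name, u in users.items():
    print(name,
          round(pf_metric(u["inst_rate"], u["avg_throughput"]), 2),
          round(mlwdf_metric(**u), 2))
```

Either scheduler would serve the user with the higher metric in each transmission interval; the M-LWDF weighting is what lets GBR traffic overtake a Non-GBR user with a better channel once its delay budget is nearly exhausted.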


Authors and Affiliations

Huda Adibah Mohd Ramli
