Posts

50) Sheikh Hasina the right choice as chief guest for Republic Day, but trust the PMO to miss the obvious

India Republic Day -- Going from a record 10 chief guests at the Republic Day parade in 2018 to none in 2021 is as much a reflection on Prime Minister Narendra Modi's out-of-the-box approach to foreign policy as on his blind spots when zeroing in on a suitable foreign dignitary. Sheikh Hasina, Prime Minister of Bangladesh, would have been the perfect Republic Day chief guest this year for umpteen reasons, yet it apparently did not occur to Modi to single her out for the honour. I shudder to even speculate whether a visionary and statesman like Modi was blinded by her religion, her gender, or both, into passing her over. Instead of inviting British PM Boris Johnson, who ultimately backed out, Hasina should have been Modi's automatic choice this

What Makes a Professional Minimalist Logo Design?

When you are in the process of creating your company's logo, one of the first questions that may come to mind is what sort of professional minimalist logo design you should use. This is an appropriate question to raise while you are still in the planning stages of your logo, because there are many different styles and formats available for your use. In this article we'll briefly look at a few of the options open to you when it comes to your choice of logo design.

o An advertorial - Every company, small or large, will find that there are certain issues in the way it presents itself to the public, and its branding becomes much easier with some kind of advertorial, or at least a short advertisement. The main advantage here is that your company's logo - and therefore your brand - will be featured in a short, concise and memorable format. It is also a format that is easily taken in at a glance.

o A website - Companie

Data compression

In signal processing, data compression , source coding , or bit-rate reduction is the process of encoding information using fewer bits than the original representation. Any particular compression is either lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy. No information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. Typically, a device that performs data compression is referred to as an encoder, and one that performs the reversal of the process (decompression) as a decoder. The process of reducing the size of a data file is often referred to as data compression. In the context of data transmission, it is called source coding; encoding done at the source of the data before it is stored or transmitted. Source coding should not be confused with channel coding, for error detection and correction or line coding, the means for mapping data onto a signal. Compress
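The encoder/decoder pairing described above can be sketched with Python's standard zlib module, which performs lossless source coding. This is a minimal illustration of the round trip, not tied to any particular codec mentioned in the text:

```python
import zlib

# A minimal encoder/decoder pair: zlib performs lossless source coding,
# so decompressing recovers the original bytes exactly.
original = b"red pixel, " * 100          # highly redundant input
encoded = zlib.compress(original)        # encoder: fewer bits than the input
decoded = zlib.decompress(encoded)       # decoder: reverses the process

assert decoded == original               # lossless: no information is lost
print(len(original), "->", len(encoded)) # compressed size is far smaller
```

Because the input is highly redundant, the encoded representation is a small fraction of the original size; the decoder reconstructs the input bit-for-bit.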

Lossless

Lossless data compression algorithms usually exploit statistical redundancy to represent data without losing any information, so that the process is reversible. Lossless compression is possible because most real-world data exhibits statistical redundancy. For example, an image may have areas of color that do not change over several pixels; instead of coding "red pixel, red pixel, ..." the data may be encoded as "279 red pixels". This is a basic example of run-length encoding; there are many schemes to reduce file size by eliminating redundancy. The Lempel–Ziv (LZ) compression methods are among the most popular algorithms for lossless storage. DEFLATE is a variation on LZ optimized for decompression speed and compression ratio, but compression can be slow. In the mid-1980s, following work by Terry Welch, the Lempel–Ziv–Welch (LZW) algorithm rapidly became the method of choice for most general-purpose compression systems. LZW is used in GIF images, programs such as PK
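The "279 red pixels" example above is run-length encoding in miniature. A sketch in Python, using hypothetical rle_encode/rle_decode helpers (not a standard-library API):

```python
def rle_encode(pixels):
    """Run-length encode a sequence into (count, value) pairs."""
    runs = []
    for p in pixels:
        if runs and runs[-1][1] == p:
            runs[-1][0] += 1              # extend the current run
        else:
            runs.append([1, p])           # start a new run
    return [(count, value) for count, value in runs]

def rle_decode(runs):
    """Expand (count, value) pairs back into the original sequence."""
    out = []
    for count, value in runs:
        out.extend([value] * count)
    return out

row = ["red"] * 279 + ["blue"] * 3
encoded = rle_encode(row)
print(encoded)                            # [(279, 'red'), (3, 'blue')]
assert rle_decode(encoded) == row         # fully reversible: lossless
```

The encoding is reversible, which is what makes it lossless; it only wins when the data actually contains long runs, which is why real schemes such as DEFLATE combine several redundancy-elimination techniques.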

Lossy

In the late 1980s, digital images became more common, and standards for lossless image compression emerged. In the early 1990s, lossy compression methods began to be widely used. In these schemes, some loss of information is accepted as dropping nonessential detail can save storage space. There is a corresponding trade-off between preserving information and reducing size. Lossy data compression schemes are designed by research on how people perceive the data in question. For example, the human eye is more sensitive to subtle variations in luminance than it is to the variations in color. JPEG image compression works in part by rounding off nonessential bits of information. A number of popular compression formats exploit these perceptual differences, including psychoacoustics for sound, and psychovisuals for images and video. Most forms of lossy compression are based on transform coding, especially the discrete cosine transform (DCT). It was first proposed in 1972 by Nasir Ahmed, who th
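The "rounding off nonessential bits" idea can be illustrated with simple uniform quantization. This is a deliberately simplified sketch of the lossy trade-off, not JPEG's actual DCT-based pipeline, and the sample values are hypothetical:

```python
def quantize(samples, step):
    """Lossy 'encode': keep only which quantization bin each sample falls in."""
    return [round(s / step) for s in samples]

def dequantize(bins, step):
    """'Decode': reconstruct approximate values from the bin indices."""
    return [b * step for b in bins]

luma = [51.2, 51.9, 52.4, 120.7, 121.1]   # hypothetical luminance samples
coarse = quantize(luma, step=4)           # fewer distinct values to store
approx = dequantize(coarse, step=4)

# Reconstruction is close but not exact: information was discarded.
errors = [abs(a - b) for a, b in zip(luma, approx)]
assert max(errors) <= 2.0                 # error bounded by half the step size
```

A coarser step discards more information and compresses better; choosing the step per frequency band, guided by what the eye or ear barely notices, is exactly the perceptual tuning the paragraph describes.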

Theory

The theoretical basis for compression is provided by information theory and, more specifically, algorithmic information theory for lossless compression and rate–distortion theory for lossy compression. These areas of study were essentially created by Claude Shannon, who published fundamental papers on the topic in the late 1940s and early 1950s. Other topics associated with compression include coding theory and statistical inference.

Machine learning

There is a close connection between machine learning and compression. A system that predicts the posterior probabilities of a sequence given its entire history can be used for optimal data compression (by using arithmetic coding on the output distribution). An optimal compressor can be used for prediction (by finding the symbol that compresses best, given the previous history). This equivalence has been used as a justification for using data compression as a benchmark for "general intelligence". An alternative view can show
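One standard way to make the compression/prediction connection concrete is the normalized compression distance, which uses a real compressor as a practical stand-in for an ideal predictor. A rough sketch using zlib (NCD itself is not mentioned in the text above; it is a common illustration of this equivalence):

```python
import zlib

def c(data: bytes) -> int:
    """Approximate information content by compressed size
    (a practical stand-in for Kolmogorov complexity)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for related data,
    near 1 for unrelated data."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog. " * 20
b = b"the quick brown fox leaps over the lazy cat. " * 20
z = bytes(range(256)) * 4   # structurally unrelated byte pattern

# A good compressor effectively "predicts" b from a, so describing
# them jointly is cheap; unrelated data gets no such discount.
assert ncd(a, b) < ncd(a, z)
```

The better the compressor models the data source, the better this distance tracks genuine similarity, which is the intuition behind using compression as a proxy benchmark for general prediction ability.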

Uses

Entropy coding originated in the 1940s with the introduction of Shannon–Fano coding, the basis for Huffman coding which was developed in 1950. Transform coding dates back to the late 1960s, with the introduction of fast Fourier transform (FFT) coding in 1968 and the Hadamard transform in 1969. An important image compression technique is the discrete cosine transform (DCT), a technique developed in the early 1970s. DCT is the basis for JPEG, a lossy compression format which was introduced by the Joint Photographic Experts Group (JPEG) in 1992. JPEG greatly reduces the amount of data required to represent an image at the cost of a relatively small reduction in image quality and has become the most widely used image file format. Its highly efficient DCT-based compression algorithm was largely responsible for the wide proliferation of digital images and digital photos. Lempel–Ziv–Welch (LZW) is a lossless compression algorithm developed in 1984. It is used in the GIF format, int
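Huffman coding, mentioned above as the 1950 successor to Shannon–Fano coding, can be sketched in a few lines with Python's heapq: repeatedly merge the two lowest-weight subtrees so that frequent symbols end up with short bit strings. A minimal illustration only; real codecs store and transmit the code table far more compactly:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code: frequent symbols get shorter bit strings."""
    freq = Counter(text)
    # Heap entries: (weight, tiebreak index, {symbol: code_so_far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, i, c2 = heapq.heappop(heap)
        # Prefix the two subtrees' codes with 0 and 1, then merge them.
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, i, merged))
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
# 'a' (5 of 11 symbols) gets the shortest code, beating a
# fixed-length encoding of 3 bits per symbol for this alphabet.
assert len(codes["a"]) <= min(len(c) for c in codes.values())
assert len(encoded) < len("abracadabra") * 3
```

Because no codeword is a prefix of another, the bit stream can be decoded unambiguously, which is the property that makes Huffman codes (and entropy codes generally) lossless.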