- Apologies! I confused Weeks 1 and 2, so I took notes on the Week 1 readings and labeled them as Week 2. I know this is now late, but Week 2 notes are coming!
Wikipedia article - "Computer Hardware"
- Hardware = "collection of physical elements that comprise a computer system" (performs input and output, stores data, manages all of these tasks together)
- History: separate manual actions --> punched cards --> stored-program computers
- Tied to history of computer data storage
- History: analog computers (physical mechanical or electrical models of the problem being solved) --> digital computers (computation on symbols, no physical analog)
- Mainframe computers, minicomputers, microcomputers (personal computers)
- Von Neumann architecture = processing unit with arithmetic logic unit and processor registers, control unit with instruction register and program counter, memory to store data and instructions, external mass storage, and input and output mechanisms
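To make the Von Neumann idea concrete, here is a minimal toy sketch (my own illustration, not from the article): instructions and data share a single memory, a program counter steps through it, and a tiny ALU performs the arithmetic.

```python
def run(memory):
    """Execute instructions stored in `memory` until HALT; return memory."""
    pc = 0  # program counter: address of the next instruction
    while True:
        op, a, b, dest = memory[pc]  # fetch + decode into the "instruction register"
        if op == "HALT":
            return memory
        if op == "ADD":              # ALU: memory[dest] = memory[a] + memory[b]
            memory[dest] = memory[a] + memory[b]
        elif op == "SUB":
            memory[dest] = memory[a] - memory[b]
        pc += 1                      # control unit advances to the next instruction

# Program and data live side by side in the SAME memory -- the key idea:
memory = [
    ("ADD", 4, 5, 6),   # address 0: memory[6] = memory[4] + memory[5]
    ("SUB", 6, 4, 7),   # address 1: memory[7] = memory[6] - memory[4]
    ("HALT", 0, 0, 0),  # address 2: stop
    None,               # address 3: unused
    10,                 # address 4: data
    32,                 # address 5: data
    0,                  # address 6: result slot
    0,                  # address 7: result slot
]

result = run(memory)
print(result[6], result[7])  # 42 32
```

A real machine works on bits rather than Python tuples, of course, but the loop above is the same fetch-decode-execute cycle the architecture describes.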
There is definitely a lot that I do not know about computer hardware, but I do understand the difference between it and computer software. It is also helpful to have learned that the specific tasks that hardware performs can be summarized as input/output, storage, and management.
Wikipedia article - "Computer Software"
- Software = "a collection of computer programs and related data that provides the instructions for telling a computer what to do and how to do it"
- Programs, procedures, algorithms, and their documentation
- "Cannot be touched"
- Software used to be bundled with hardware; now it is a business of its own
- Software licensing issues, patents
- Types of software: system, programming, application, middleware, teachware, testware, firmware, shrinkware, device drivers
- Types of architecture: platform, application, user-written
- Free software license = recipients can modify and redistribute software
I had previously thought of software just in terms of "application" software, so now I understand that the term encompasses much more than that. Again, there is still a ton that I do not know about computer software, but I think that I do have a good basic understanding of what I need to know.
Digitization: Is It Worth It?
- Digitization = "the conversion of analog media to digital form"
- Downsides: long process, complex, many things can go wrong
- Cost-benefit analysis? Pay for conversion itself but also assembling material, copyright licenses, machine upkeep, editing, cataloging, managing
- Upsides: increasing access, preservation, increase visibility of institution
- Digitizing vs. not digitizing: if item is in demand enough, digitizing becomes cost-effective
- Digitizing vs. acquiring new materials: depends on many factors - digitizing is not always the right choice
- CONCLUSION: Each case should be treated separately.
European Libraries Face Problems in Digitalizing
- European Digital Library as a competitor to Google Books project
- Huge cost of digitizing materials
- Originally had only limited public funds, now seeking private alliances
- Business model?
- Who controls the future of digital records and writings?
- Alliance with Google itself?
- Charging for access to copyrighted materials? Low-quality for free but higher quality for a fee?
Wikipedia article - "Data Compression"
- Data compression (or source coding or bit-rate reduction) = "encoding information using fewer bits than the original representation would use," makes use of redundancy
- Reduces consumption of hard disk space or transmission bandwidth
- Must be decompressed to be used (may need certain hardware)
- Lossless compression = no error (the original data is fully recoverable), but generally slower compression/decompression (e.g., Lempel-Ziv or LZ)
- Lossy compression = some error (depending on how much error is acceptable), faster compression/decompression
- Data compression is closely related to theories from machine learning and to data differencing
- Audio data compression - considers the design and function of the human ear
- Video data compression - considers the design and function of the human eye
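The redundancy point above can be demonstrated in a few lines using Python's standard-library `zlib` module, which implements DEFLATE (an algorithm built on the Lempel-Ziv approach). This is just a sketch I put together to see the effect: highly repetitive data shrinks dramatically, while random data (which has no redundancy to exploit) barely shrinks at all.

```python
import os
import zlib

redundant = b"the rain in Spain " * 100   # 1800 bytes, very repetitive
noisy = os.urandom(1800)                  # same size, essentially no redundancy

c1 = zlib.compress(redundant)
c2 = zlib.compress(noisy)
print(len(redundant), "->", len(c1))  # repetitive data shrinks dramatically
print(len(noisy), "->", len(c2))      # random data barely shrinks (may even grow)

# "Lossless" means a perfect round trip: decompression restores every bit.
assert zlib.decompress(c1) == redundant
```

Lossy compression, by contrast, would throw some of the original information away permanently (as JPEG or MP3 do), trading exact recovery for much smaller files.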
I remember reading a little about this topic in The Information: A History, a Theory, a Flood that we read in LIS 2000 last semester. At that time I was amazed at the way that data compression makes use of redundancy, mostly because we hardly notice this redundancy in our daily lives. Once you start looking for redundancy, though, you see that it is everywhere. I am someone who is very interested in psychology, so the way that data compression algorithms have to consider the way the human brain and body work is fascinating to me.