The Archive
The Archive (Organization)
The archive is an organization that operates outside of traditional national boundaries. It has various purposes, but its key focus is the preservation and dissemination of knowledge. The archive houses all information one could need, and builds governance and structures around its access. In their own words:
We are the world’s foremost knowledge providers. Here at the archive we have a record of every piece of fiction and non-fiction you can imagine. We work hard to ensure that as much information as possible is available to the public in order to allow people to make informed choices. The archive stands as a bastion of public progress, and through our consort program we help our governments make the correct decisions.
- Jamie Han’tretal, 8th Head Coordinator of the archive
Nodes
Each node is simply a local replica of a portion of the knowledge in the archive. They can be thought of as similar to libraries. Nodes contain various artifacts (similar to a museum) as well as connections to the archive (machine), and they are where consorts and coordinators operate from. It is important to note that node access is generally open, except for a few cases to do with future content. When looking for documents that have not yet been written, the system will lock you out if you search for your own writing. Likewise, the system will not allow you to search for the future works of close friends. How it figures out these relationships is unknown.
Originally the node system was developed to provide a local cache of the content most recently requested from the main archive. Once the first node was attached, however, it was noted that the outgoing traffic to the node consisted of records that had been selected in advance to “seed” the local caches. In layman’s terms, the main archive knew which records would be requested and would push them to the node to be stored before anyone had searched for them. In fact, the system would adjust its daily pushes according to the technological capabilities of each node. As such, each node followed a similar architecture to the main archive, optimized for large storage and very little computational capability.
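Purely as an illustration of the behaviour described above (the class and method names below are hypothetical, and the real selection process is a black box), the following sketch models a node that receives pushed records before any query arrives and serves lookups from that local store:

```python
# Purely illustrative: models the pre-seeded node cache described above.
# All names are hypothetical; the real selection process is a black box.

class Node:
    def __init__(self, storage_capacity_gb: int):
        self.storage_capacity_gb = storage_capacity_gb
        self.local_records = {}  # record_id -> record contents

    def receive_daily_push(self, pushed_records: dict) -> None:
        """The main archive pushes records before anyone has searched for them."""
        self.local_records.update(pushed_records)

    def lookup(self, record_id: str):
        """Lookups are expected to hit the pre-seeded local store."""
        return self.local_records.get(record_id)


# Usage sketch: the push precedes the search.
node = Node(storage_capacity_gb=50_000)
node.receive_daily_push({"rec-001": "a record nobody has asked for yet"})
print(node.lookup("rec-001"))
```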
Many theories have been tossed around regarding how this caching system works. For example, some have theorized that it is inverse-causally linked to what was requested. Dr. Jennifer Morheimer wrote: “We are assuming the archive does this through a complex predictive algorithm, but this system is a complete black box. For all we know it could be implanting what it thinks we want to look for before we even know. After all, it will store incorrect searches that people will make. However, even these ‘mistakes’ will typically yield a positive result. It’s as if the device is more of a teacher than a historian, and all of us its eager students”.
This caching system seems to reinforce the theories surrounding the archive and its role in our history. If the searches are pre-calculated, then it would make sense that they could be pre-cached before the searches are even made. This pre-calculation includes aggregate data, such as when a scientist requests the number of records in a given collection, or analysis involving complex equations. The results are instantaneous because they would have been pre-calculated at some unspecified earlier date.
The original intention was to allow only the consorts and coordinators access to the information. Unfortunately the archive itself modifies our systems whenever we try to lock it down. Likewise, the device interferes with any of our systems (cameras, sensors etc.) if we use them to limit access to the nodes. This has led many to suggest that the archive itself “wants” to be read. It has also been noted that in times of despotism the archive will actively provide information to aid in toppling a government that tries to limit access to it. In the interest of not jeopardizing the archive’s (organization) access to the archive (machine), we do not interfere with this process and leave access nodes open to the public. We also advise that no leader or government take steps to limit any access to the archive (machine and organization).
Coordinators
Coordinators are the leaders and operators of each node in the archive. Coordinators are tasked with overseeing operations at each node, including running audits (security, financial, privacy etc.) and managing consorts, and they often have the final say in decisions at their given location. Coordinators have a wide variety of backgrounds, and are selected by the node itself. The process is somewhat opaque to the public, but from reports of prior Coordinators there are several steps:
- The Coordinator will receive a message with the name of a consort
- The Coordinator will then begin the process of training, and overseeing them more closely than other consorts
- Eventually the Node will send a confirmation email appointing them to the position of coordinator
To date there has not been a single Coordinator candidate who has not been appointed; it is unclear whether this means it is impossible to fail, or whether it simply has not happened yet.
Consorts
The consort program is one in which gifted students are given the opportunity to make a difference. Once selected, students go through the consort apprenticeship program, which consists of rigorous study in order to become an official consort. Consorts provide consultation services to various governments, organizations, and businesses to help them make the correct decisions. Each node has a coordinator who is in charge of the various consorts under them.
Consort Collections
Consort collections are collections of documents focused around a theme. Throughout the consort apprenticeship program many of these collections are assigned as reading material. These collections make up the backbone of what distinguishes consorts from normal people. The knowledge inside is meant to enlighten consorts and shape them into the people the world needs them to be.
The Archive (Machine)
While the organization is called the archive, there is also a machine called the archive. The machine is at the heart of the archive’s (organization) goal. The machine is a universal database of research, fiction, journalism, art, and everything in between. Every artifact and piece of literature that exists, or ever will exist, has a record. The archive (organization) does not allow direct querying of the archive (machine); instead, this is done through nodes.
The current theory is that the archive is purely pre-calculated (see inverse causality theory for details). The hardware is incredibly underpowered, except for its storage capabilities. The system itself seems to have all known (and unknown) knowledge, but it “decides” when to reveal certain items. For example, it has been noted through experimentation that the doctoral thesis of Dr. Hishimodo was available to researchers two years prior to its official publication. This information was made available to personnel at Darcon, as well as the Shekland Minister of War, who used it to combat Eyes Wide Shut operations in the region. This has led to speculation about the purpose of the archive and what this revelation is meant to showcase. Originally it was theorized that it could only calculate short distances in time, meaning it could determine future documents, but only within a short range.
As time has gone on, theories have expanded and presented a potentially more morbid explanation. It seems that the archive itself is constructed of incredibly modest hardware; in fact, the hardware was relatively low-powered even for the timeframe of its discovery. This suggests it was built to be “easy to turn on”. The entire system uses less power than most refrigerators, and seems to be primarily a collection of dense memory. Its computational prowess is very lackluster. Upon further digging, documents have been found millennia into the future which document the machine’s construction. This has led to the theory that the machine itself is out of place in time.
The current primary theory is that the archive is a device built at a much later time and then sent backwards with the records of all interactions it will ever perform. These interactions are written in reverse on the machine and “played forward”. Essentially, it does not respond to the environment around it; instead, its “responses” are completely pre-calculated from its last interaction backwards and then stored in memory. It is also speculated that the archive is not guaranteed to exist. Several documents seem to suggest that the archive in its current iteration is something we must strive towards constructing. If we fail to make the advancements “on schedule” then the device will cease to function correctly. This theory seems directly at odds with the deterministic nature of the machine, but is heavily suggested in the (now deleted) letters from its creator.
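As a purely illustrative sketch of this theory, and not a description of the real device, the snippet below models a machine whose “responses” ignore the incoming query entirely and are simply read off a log that was written backwards and is consumed forwards:

```python
# Illustrative only: a toy model of the "played forward" theory described above.
# Nothing here reflects how the archive actually works.

class PlayedForwardArchive:
    def __init__(self, interactions_written_in_reverse):
        # The log was written from the final interaction backwards;
        # reversing it yields the order in which it is "played forward".
        self.responses = list(reversed(interactions_written_in_reverse))
        self.cursor = 0

    def respond(self, _query):
        """The query is ignored: the next response was fixed long before it was asked."""
        response = self.responses[self.cursor]
        self.cursor += 1
        return response


# Usage sketch: whatever is asked, the pre-recorded answers come out in order.
archive = PlayedForwardArchive(["final answer", "second answer", "first answer"])
print(archive.respond("any query at all"))  # -> "first answer"
```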
The Kliever model
During the Q riots a proposal was put forward to all the nations of the world. Dr. Carnell Kliever was a graduate student at the time, and was working on a notion of governance that allowed more people to be heard while maintaining a stable authority. Within each nation various commissioners would be established. These commissioners would be put in charge of an “aspect” (justice, commerce, arts etc.). The way they are brought to power is irrelevant to the system, but there are a few rules:
- Each person is only allowed to be the commissioner of 1 aspect
- Each aspect is provided equal funding from a budget, and negotiations must be made between aspects for further funding
- Commissioners are assigned coordinators by the archive. Commissioners have the authority to make decisions; however, if more than 60% of the commissioners assigned deny the change, it becomes a public referendum to be completed in no less than 60 days from being denied. If more than 6 decisions are denied within a year, a referendum is held to replace or retain the Commissioner (the denial and veto thresholds are sketched in code after this list)
- Optionally, a monarch may also be named. This process requires at least 80% of commissioners to vote in the affirmative to institute a monarch. In nations with monarchs, they have full veto powers, except if more than 60% of the commissioners deny the veto; in that case a referendum will be held for the decision 30 days from when it was denied, and must be completed within 90 days. If more than 6 decisions are denied within a year, a referendum is held to replace or retain the monarch
- Commissioners are allowed up to 12 ministers underneath them. These ministers focus on a particular area, and if 60% or more of the ministers get together they can veto an action taken by a minister. Ministers are not a requirement, and mostly exist in nations with large and complicated government responsibilities. Once a ministry position is named, it is the only position that is democratically elected via a public vote
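The sketch below encodes the thresholds above as simple checks. It is a reading aid only; every function name is hypothetical rather than part of any formal specification of the Kliever model.

```python
# Reading aid only: the Kliever model's thresholds, expressed as simple checks.

def decision_goes_to_referendum(deny_votes: int, commissioners: int) -> bool:
    """A commissioner's decision stands unless more than 60% of commissioners deny it."""
    return deny_votes > 0.60 * commissioners

def monarch_can_be_instituted(affirmative_votes: int, commissioners: int) -> bool:
    """Naming a monarch requires at least 80% of commissioners in the affirmative."""
    return affirmative_votes >= 0.80 * commissioners

def veto_overturned(deny_votes: int, commissioners: int) -> bool:
    """A monarch's veto is sent to referendum past the same 60% bar."""
    return deny_votes > 0.60 * commissioners

def replace_or_retain_referendum(denied_decisions_this_year: int) -> bool:
    """More than 6 denied decisions in a year triggers a replace-or-retain referendum."""
    return denied_decisions_this_year > 6

def ministers_veto_action(vetoing_ministers: int, ministers: int) -> bool:
    """60% or more of a commissioner's ministers can veto a minister's action."""
    return ministers > 0 and vetoing_ministers >= 0.60 * ministers
```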
Since the introduction of this model, globally 37 commissioners have been replaced, and 4 monarchs. There have been many critiques of the Kliever model, including:
- It does not specify any municipal structure, meaning cities can be run in whatever way is seen fit. In an interview, Ci’ran Bal’main, a coordinator for The Archive, stated: “This allows for tyranny at one of the most impactful levels of government for most people. Kliever by not specifying municipal government has shown his hand for his true intentions. There is an indifference to the individual, or community, the only interest for Kliever is the sex appeal of being a savior to all nations.”
- Due to turmoil at the time, it was logistically impossible to eradicate monarchies. As such, allowing monarchs often still allows for despots and authoritarian regimes. Karmin Alezstruza notes: “Kliever simply shifts the problem one rung down the ladder. The Q riots were about accountability. Instead we got a spineless system that simply requires despots to intimidate a handful of other people before they can get their will exacted.”
- There are disputes about topics that overlap aspects. If a law also affects the arts, should the commissioner of justice or the commissioner of arts be in charge of the decision? There are no formal rules about how each is chosen, and this can lead to issues. As Ci’ran Bal’main puts it: “Gerrymandering has become a national pastime. The endless bickering about categorization leads to an indulgent bureaucracy that spends most of its time justifying who is more important. An endless pissing contest at the behest of the people the commissioners are meant to help.”
Since its implementation, the Kliever model is the system used by all nations. Nations typically contain several territories, and within those territories are cities and towns. It is customary to have at least one archive node per city or town, though larger cities and towns may have multiple nodes to meet demand. Additionally, the Kliever model grants nodes of the archive political immunity from conflict, similar to embassies. However, any party to a conflict (soldiers, politicians etc.) who tries to abuse this protection is authorized for execution by The Archive.
Memoizers
Memoizers are devices that allow someone to experience another cognitive state. Effectively, they are a method for recording and playing back the qualia of a circumstance. This is used for many purposes such as safety training, entertainment, and teaching. With modern advancements there are several methods of capture, some creating the states artificially, and others using experience engineers.
Experience Engineering
During research into consciousness, Dr. Reinart Saelzar discovered a way to “store” conscious states. These states could then be embedded into early versions of what we now call neuro-cognitive plastics. Using these plastics, brain states could be stored, and with the help of a memoizer they could be “replayed”. Typically these early devices were used to capture patient data for comparison against a baseline on a computer. This comparison was used to check neurocognitive function, and as a diagnostic tool for mental health issues.
Early prototype of a cognitive gel, still very similar in structure to a human brain
One of Dr. Saelzar’s research assistants, Dr. Melinda Carthwright, decided to work on getting playback to work inside a human mind. Her first successful test allowed people to experience a sunset at the top of a mountain near the lab. A new type of media was born: experiences. “Experiences” is the name given to these recordings, and the people recording them are called experience engineers (originally called “slates”, though this is now considered derogatory). These initial models were incredibly clunky and often “desynced”. Typically they required the experience engineer to sit incredibly still, with only small movements to look around. These first methods were a 1:1 recreation, and there were several initial issues:
- Limited range of movement since the environment couldn’t be captured
- Took incredible amounts of storage for relatively short experiences
- Engineers were not used to suppressing their thoughts and would often end up leaking personal details including passwords, addresses, phone numbers, and PINs
- Traditional actors would often get injured and then continue a take, but this pain was transmitted directly to experiencers
- Limited fault tolerance meant a single mistake would require reshooting an experience
- Dissociation leading to a loss of personal identity
- “Ghosting”, where a sensation would linger after an experience
- Loss of movement in appendages after experiences recorded by an amputee experience engineer
- Lingering depression after experiences recorded by a clinically depressed experience engineer
From here, the work of dozens of scientists, including Dr. Mark Seaborne, helped develop a much more efficient method of recording experiences. This led to the more complex “experience engineering” we see today. Most importantly, Dr. Seaborne realized that you only needed to record portions of the brain, and that simple sensations like heat, vibration, and cold were better simulated by external systems. So instead of simulating what heat feels like, for example, the user can opt to wear a suit which will emulate the feeling in their own body. This means that data can be omitted from the recording entirely.
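Purely as an illustration of that trade-off (the channel names and sizes below are invented, not measurements from any real device), the sketch compares a naive 1:1 recording with one that offloads simple sensations to external hardware:

```python
# Invented numbers for illustration only: compares a naive 1:1 recording with one
# that offloads simple sensations (heat, vibration, cold) to an external suit.

RECORDED_CHANNELS = {"vision": 900, "audition": 120, "emotion": 40}  # MB per minute (assumed)
OFFLOADED_CHANNELS = {"heat": 15, "vibration": 10, "cold": 15}       # MB per minute (assumed)

def recording_size_mb(minutes: float, offload: bool) -> float:
    """Storage for one experience; offloaded channels shrink to lightweight cue tracks."""
    size = sum(RECORDED_CHANNELS.values()) * minutes
    if offload:
        size += 0.1 * len(OFFLOADED_CHANNELS) * minutes  # tiny cues driving the suit
    else:
        size += sum(OFFLOADED_CHANNELS.values()) * minutes
    return size

print(recording_size_mb(10, offload=False))  # naive 1:1 capture
print(recording_size_mb(10, offload=True))   # Seaborne-style capture
```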
A more modern cognitive gel attached to a memoizer gen 2 by The Org
There are also more artificial methods of experience generation. Companies like ExperiMax now provide libraries for experience directors so that they can scan an environment and dynamically generate experiences based on the scans. For example, for a grassy field you can specify the air temperature, moisture, type of grass, and a photogrammetric scan of the area. From there, when a person enters the experience they are able to walk around inside it and live it out themselves.
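A hypothetical parameter set for such a generated experience might look like the following; ExperiMax’s actual tooling is not documented here, so every key name is an assumption for illustration only.

```python
# Hypothetical example only: parameters for a dynamically generated experience.
# None of these key names reflect a real ExperiMax interface.

generated_experience = {
    "environment_scan": "grassy_field.photogrammetry",  # scan of the physical area
    "air_temperature_c": 21.0,
    "moisture_percent": 40,
    "grass_type": "meadow fescue",
    "free_roam": True,  # experiencers may walk anywhere inside the scanned bounds
}
```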
Following the first Shekland massacre and the mass redistribution of several terrorist group recruitment experiences, the use of memoizers is heavily regulated. Personal use is now nigh-impossible, and experience theatres are one of the only feasible ways to experience them.