Archeo Futurus, Inc
less cloud, more service

What is Archeo Futurus?

We provide cloud services like CDN, Edge Computing, and Real-time Database. We can do EVERYTHING existing cloud providers do, but faster, more reliably, and more securely.

Why Archeo Futurus?

  • Fast
  • Reliable
  • Secure
  • Easy to use
  • Great Customer Service
  • Reduced Overhead Cost
  • Greater Speed / Performance
  • Lower Latency
  • Proven Security
  • Positive Environmental Impact

Technology Overview

Abstract

Novel tools allow the use of FPGAs in contexts previously relegated to CPUs with thick piles of abstractions. Formal verification obviates the need for operating systems, their security model, and abstraction layers. Provably optimized computation models allow the use of ANY programming language to target ANY platform, and create the smallest/fastest implementation.
US Patent 9,996,328

Novel Tools

Once upon a time, a suite of tools was designed for efficiency and generality. Over time the efficiency reached the mathematical limits of possibility, and the generality expanded to cover all computable problems. These tools have been used to recreate the full vertical solutions that solve some of the most common use cases. The full power of these tools is apparent when they are applied to new problems. The architect’s interaction with these tools is far different from the standard development model today. The architect brings in representations of the standards used for communication, all of the edges of the black box, then defines the required behavior using relational specifications. The end product is a computational model that can be targeted at a CPU, GPU, FPGA, or ASIC.

Formal Verification

The tools can ingest an arbitrary number of inputs in any programming language, then compare and contrast them against Formal Specifications, lists of requirements, and test suites. The usefulness of Formal Verification comes from the ability to say with certainty that an implementation matches the expectations set by a specification or a previous implementation. Formal Verification allows whole classes of mistakes to be prevented and addressed. The list of those mistakes is incredibly long; its corollary is the list of implementation attributes that become provable. Those attributes include quality of service, response time, limits, hardware usage, and correctness. Formal Verification is also useful for shining light on the shortcomings of some standard specifications. But, like the four-billion-address limit of IPv4, it cannot overcome the limits placed by the specification writers.
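The core idea, checking that an implementation matches a specification, can be illustrated with a toy sketch. This is not Archeo Futurus's patented tooling; it is a minimal bounded check in Python, where the function names and the choice of absolute value as the example are invented for illustration. Real formal verification proves the property for all inputs symbolically rather than by enumeration.

```python
# Toy sketch of verification-by-comparison (NOT the patented tools):
# check that a candidate implementation agrees with a specification
# on every input in a bounded domain.

def spec_abs(x: int) -> int:
    """Specification: the mathematical absolute value."""
    return x if x >= 0 else -x

def impl_abs(x: int) -> int:
    """Candidate implementation using a branch-free bit trick."""
    mask = x >> 31            # arithmetic shift: 0 for x >= 0, -1 for x < 0
    return (x ^ mask) - mask  # equals -x when mask == -1, x otherwise

def verify(spec, impl, domain) -> bool:
    """True iff impl matches spec on every input in the domain."""
    return all(impl(x) == spec(x) for x in domain)
```

A symbolic prover would replace the exhaustive loop in `verify` with a proof over all integers, which is what lets whole classes of mistakes be ruled out rather than merely sampled for.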

Operating Systems

Operating Systems provide many things to many different groups of people, but the most important, by far, are the things they provide to developers. Beauty and symmetry are valued far less than ease of use and familiarity with existing developer tools. A development system made out of whole cloth would languish for a generation or more. The abstraction layers provided are etched into the minds of students, and only well reasoned about by seasoned professionals. The security models and their implications are likewise assimilated slowly, all the more so because a lack of understanding of them does not impede the addition of new features. Today, at this moment, there are countless efforts to remove operating systems as platforms of development, be that Web Standards, NodeJS, or cross-platform development tools. The days of separate teams developing for each operating system are coming to a close.

Security Model

Of the various security models in common use (layers, obscurity, code review, bug bounties, etc.), none offers any proof of effectiveness, absolute or probabilistic. For the systematic behaviours that should be enabled, and those that should be prevented, no current security model offers any peace of mind. The best security implementation today is a large team of people with watchful eyes looking for unsuspected patterns, or a well-trained artificial intelligence reporting the same. Quick recognition and response is the gold standard. Using the new tools and Formal Verification, it is possible to prevent, absolutely, undesired behaviours, and to enable, absolutely and with a defined quality of service, desired behaviours. However, the definitions of what is desirable must still be subject to quick recognition and response by a competent team.

Abstraction Layers

Abstraction layers are a necessity for human thought patterns, as a matter of psychology. No person can give meaningful thought and attention to the countless atoms in a glass of water while simply trying to drink it. Abstraction layers bring focus and allow a common language among developers solving problems... they are the vocabulary of the language used to describe the solutions to so many problems. Abstraction layers also cause endless headaches, as they are leaky, ill-fitting, slow to change, and poorly defined. The new tools build abstractions using symbolic logic, relationships, and fully internally described verbs and nouns. This gives the power to leverage the work held in common while removing the headaches.

Provably optimized computation models

Optimization has far too many definitions... specifically, in terms of transistor count, memory used, code size, power usage, run time, latency, throughput, quality of service, maintenance, developer hours, etc. Some of these are mutually exclusive; algorithms exist that are specifically designed to make space/runtime tradeoffs. Some are only probabilistic in nature and measure, while others are finite and deterministic. A more general model weighs all of these areas against each other by assigning each cost a common measure, a dollar value for each area, for example. The tools include a general constraint solver that can give a probabilistic proof of optimization for any mix of probabilistic and deterministic areas. The tools can also give an absolute proof of optimization in many deterministic areas, when they do not conflict.
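The dollar-weighting idea above can be sketched in a few lines. This is a hypothetical illustration, not the patented constraint solver: the cost figures, candidate names, and metrics are all invented, and the "proof" of optimality here is simply exhaustive comparison over a finite candidate set.

```python
# Toy sketch of multi-objective weighing (NOT the patented solver):
# fold every optimization area into dollars, then pick the minimum.

# Hypothetical per-unit costs: dollars per transistor, per watt, per ms.
COSTS = {"transistors": 0.000001, "watts": 0.50, "latency_ms": 2.0}

# Hypothetical design candidates with invented metric estimates.
CANDIDATES = {
    "cpu":  {"transistors": 5_000_000, "watts": 65.0,  "latency_ms": 12.0},
    "gpu":  {"transistors": 9_000_000, "watts": 150.0, "latency_ms": 3.0},
    "fpga": {"transistors": 2_000_000, "watts": 20.0,  "latency_ms": 5.0},
}

def total_cost(metrics: dict) -> float:
    """Fold all metric areas into a single dollar figure."""
    return sum(COSTS[k] * v for k, v in metrics.items())

def cheapest(candidates: dict) -> str:
    """Exhaustive search: optimal by construction over this finite set."""
    return min(candidates, key=lambda name: total_cost(candidates[name]))
```

A real solver works over a vast, partly probabilistic design space rather than three hand-written rows, but the principle is the same: once every area has a common unit, conflicting goals become a single minimization problem.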

ANY programming language

The tools can ingest any programming language, formal specification, etc. The greatest advance is the ease with which new domain-specific languages can be created. Building new languages specifically tailored to the problem at hand allows the briefest description of the problem and solution, in terms of code size and developer hours. Given Formal Verification for correctness and security, and the provable optimization tools, architects and developers can quickly prototype new specifications and languages to solve problems. Once a verified solution to a given problem is created, that problem never again needs to be addressed; no better solution can be found. Only new features would be added over time; no patches or corrections would ever need to exist.
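How cheaply a small domain-specific language can be stood up is easy to demonstrate. The sketch below is hypothetical and unrelated to the patented tools: an invented firewall-rule mini-language ("allow"/"deny" plus a port number) compiled into a decision function in roughly fifteen lines of Python.

```python
# Hypothetical mini-DSL (invented syntax, not the patented tools):
# compile lines like "allow 443" / "deny 22" into a decision function.

def compile_rules(source: str):
    """Parse the rule text once, returning a reusable policy function."""
    rules = []
    for line in source.strip().splitlines():
        action, port = line.split()
        rules.append((action == "allow", int(port)))

    def decide(port: int) -> bool:
        # First matching rule wins; unmatched ports are denied by default.
        for allow, p in rules:
            if p == port:
                return allow
        return False

    return decide

policy = compile_rules("""
allow 443
allow 80
deny 22
""")
```

The point is not the firewall itself but the brevity: the problem statement and the language that expresses it fit on one screen, which is the leverage a tailored DSL provides.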

ANY platform

Any physical or virtual computation device can be targeted; some platforms already used are: Javascript in a browser, Windows, Linux, bare x86/x64, OpenGL compute, OpenCL, VHDL, FPGA, and simulated transistors. The only limitations on a platform are the connections to the outside world a solution requires (a graphing calculator has no GPS or internet connection), sufficient memory to contain the whole of the computation machine, and the speed needed to operate as desired. Cross-platform development is the rule: every solution created can be applied to every platform that meets the constraints as designed.

Smallest/Fastest Implementation

In some cases, many cases even, it is possible to create the implementation that is both the smallest in terms of silicon real estate and the fastest in terms of latency and throughput. There is a class of problems well suited to such optimization. The problems in this class include a significant subset of computable problems; the boundaries of that subset are subject to NDA, but they do include the problems solved by the complete stacks of commonly used software today. What is possible is measured in orders of magnitude... improvements in transistors needed, transistor transitions, energy/heat, latency, throughput, and density. All of these combine to change the economics so drastically as to be comparable to the transition from vacuum tubes to transistors.

Design, Deploy and Protect

DDoS Protection

Distributed Denial of Service protection in depth. DDoS attacks are growing every year; we remove that worry.

Load Testing & Balancing

Load testing shows where improvements need to be made and gives the confidence of knowing how much load can be handled.

WAF

Web Application Firewall
Protect vulnerable backend servers from application attacks.

DNS Serving & Resolving

Domain Name System
Authoritative Serving and Recursive Resolving

Hosting / CDN

Hosting combined with a powerful Content Distribution Network

Transit

Dedicated Internet Access over Fiber and Copper with a robust set of global transit connections.

BGP Anycast

Border Gateway Protocol Anycast with custom session and routing

VPN

Virtual Private Network protects the content of connections. Combined with robust peering and transit providers allows for fewer intermediaries.

RUM

Real User Measurement is the new standard for measuring performance on your end users' devices.

Realtime Messaging

Messaging between users of your website allows for games and chat

WebSockets

Modern games and lively web applications communicate using WebSockets for high-speed and low-latency communications

WebRTC

Web Real Time Communications is used for video and audio communications directly between website users

Roadmap

Stage 1 - Deploy into 5 data centers

Data centers selected for how robust their internet exchanges are

  1. Seattle, Washington
  2. San Jose, California
  3. Los Angeles, California
  4. Chicago, Illinois
  5. Ashburn, Virginia
  6. Plus a complete spare deployment for quick replacement

Future expansion

  • Stage 2 - 20 Locations - Saturate North America
  • Stage 3 - 10 Locations - Europe and Near East
  • Stage 4 - 10 Locations - Asia-Pacific and Australia
  • Stage 5 - 10 Locations - South America and Africa