Reoptimization for Great Power Competition

Department of the Air Force

“I’m extremely proud of the Space Force and all the good it has accomplished. But, as good as we are, as much as we’ve done, as far as we’ve come, it’s not enough. We are not yet optimized for Great Power Competition.”

~ Chief of Space Operations
Gen. Chance Saltzman 

Space Force & Air Force announce sweeping changes to maintain superiority amid Great Power Competition

The establishment of the U.S. Space Force was a direct response to threats arising from Great Power Competition in the space domain. Nevertheless, our legacy roots leave us sub-optimized for the security environment confronting us today, and we must finish fine-tuning the service to continue meeting its National Defense Strategy responsibilities.

In early 2024, the Department of the Air Force unveiled sweeping plans for reshaping, refocusing, and reoptimizing the Air Force and Space Force to ensure continued supremacy in their respective domains while better posturing the services to deter and, if necessary, prevail in an era of Great Power Competition. Through a series of 24 DAF-wide key decisions, the Department will address four core areas that demand its attention: Develop People, Generate Readiness, Project Power, and Develop Capabilities.

The space domain is no longer benign; it has rapidly become congested and contested.

We must enhance our capabilities, develop Guardians for modern warfare, prepare for the high-intensity fight, and strengthen our power projection to thrive and win in this new era of Great Power Competition.

Video by Kenneth M. McNulty and Kevin D. Schmidt
Michael Robinson - Topological Features in Large Language Models (and beyond?)
Air Force Research Laboratory
Oct. 11, 2024 | 01:00:53
In this edition of QuEST, Michael Robinson discusses topological features in large language models.

Key Moments and Questions in the video include:
Acknowledgement of colleagues from DARPA and Galois
Manifolds in machine learning
LLM token space is higher-dimensional
Manifold spaces tend to be negatively curved
LLMs turn text into vectors
Transformers turn vectors into new text
How do we turn the text into vectors? (a minimal sketch follows this list)
We think of LLMs as being trained on all human language, but they have not been
GPT2, an open-source LLM, as the source model
GPT2 used as the example
Tokens have topology and geometry
Words are a categorical variable
Vectors are a numerical variable
Mixing data types can lead to some problems
Why care about the token space?
Not all tokens correspond to a valid vector
Estimating dimensions
Volume of a sphere
Log-volume vs. log-radius curves (a toy sketch follows this list)
Ricci scalar curvature
Stratifications are visible
GPT2 uses a state space that is not a manifold
Dollar sign represented differently in GPT2 because $ appears in code while other currency symbols do not
GPT2's 768 dimensions unwrapped using t-SNE (a rough sketch follows this list)
Tokens with leading spaces
Beginnings of words show up in a separate, low-dimensional piece
Visual similarity to hyperbolic plane
LLEMMA7B dimensions
Plotting dimension
Dark space corresponds to non-printing characters
Thinking about how neural activation patterns work
We have been thinking about manifold learning out of mathematical convenience
State spaces are not manifolds
Opening the presentation to conversation
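
To make the "turn text into vectors" step concrete, here is a minimal sketch in Python, assuming the Hugging Face transformers library and PyTorch (tooling the talk itself does not specify). Each token ID simply indexes a row of GPT2's learned embedding matrix, giving one point in the 768-dimensional token space the talk examines.

    # Minimal sketch: GPT2 text -> token IDs -> 768-dimensional vectors.
    # Assumes the Hugging Face "transformers" package (not named in the talk).
    from transformers import GPT2Tokenizer, GPT2Model

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2Model.from_pretrained("gpt2")

    ids = tokenizer("Space is congested and contested")["input_ids"]
    vectors = model.wte.weight[ids]   # one row of the embedding matrix per token
    print(vectors.shape)              # torch.Size([n_tokens, 768])

These are the raw token embeddings; the transformer layers then map such vectors back into new text, as the key moments above note.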
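
The dimension-estimation idea behind the volume-of-a-sphere and log-volume vs. log-radius moments can also be illustrated with a toy sketch. This uses a synthetic point cloud rather than real GPT2 embeddings: near a point of a d-dimensional set, the number of neighbors within radius r grows like r^d, so the slope of log(count) against log(radius) estimates the local dimension.

    # Toy sketch of local dimension estimation via log-volume vs. log-radius.
    # A synthetic 3-D point cloud stands in for LLM token embeddings.
    import numpy as np

    rng = np.random.default_rng(0)
    points = rng.uniform(-1.0, 1.0, size=(5000, 3))   # uniform 3-D cloud
    center = np.zeros(3)                              # interior query point

    radii = np.linspace(0.15, 0.6, 15)
    dists = np.linalg.norm(points - center, axis=1)
    counts = np.array([(dists < r).sum() for r in radii])

    # Counts grow like radius**d on a d-dimensional set, so the slope of
    # the log-log curve approximates the local dimension d.
    slope, _ = np.polyfit(np.log(radii), np.log(counts), 1)
    print(f"estimated local dimension ~ {slope:.2f}")  # close to 3

Applied to real token embeddings, the talk reports that such estimates vary across the space and expose visible stratifications, which is how the non-manifold structure of GPT2's state space shows up.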
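
Finally, the unwrapping of GPT2's 768 dimensions can be sketched with t-SNE. This is a rough outline assuming transformers and scikit-learn; the talk's actual visualization settings are not given.

    # Rough sketch: flatten GPT2's 768-dimensional token embeddings to 2-D
    # with t-SNE. Assumes "transformers" and "scikit-learn" (my choice of
    # tooling, not the talk's).
    from sklearn.manifold import TSNE
    from transformers import GPT2Model

    emb = GPT2Model.from_pretrained("gpt2").wte.weight.detach().numpy()
    subset = emb[:2000]   # a slice of the 50257-token vocabulary, for speed

    coords = TSNE(n_components=2, init="pca", perplexity=30).fit_transform(subset)
    print(coords.shape)   # (2000, 2): one 2-D point per token

Coloring such a plot by token properties (leading spaces, non-printing characters) is what reveals the separate low-dimensional pieces and dark regions noted above.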