What is Reiki?
Reiki was originally taught by its founder, Mikao Usui, in Japan during the 1920s as a spiritual path for those seeking enlightenment and as a method of self-healing.
Today in the West, Reiki is primarily thought of as a form of complementary medicine or spiritual healing.